Columns: datasetId (string, 2–117 characters), card (string, 19–1.01M characters)
HeshamHaroon/oasst-arabic
--- dataset_info: features: - name: message_id dtype: string - name: parent_id dtype: string - name: user_id dtype: string - name: created_date dtype: string - name: text dtype: string - name: role dtype: string - name: lang dtype: string - name: review_count dtype: int64 - name: review_result dtype: bool - name: deleted dtype: bool - name: rank dtype: float64 - name: synthetic dtype: bool - name: model_name dtype: 'null' - name: detoxify struct: - name: identity_attack dtype: float64 - name: insult dtype: float64 - name: obscene dtype: float64 - name: severe_toxicity dtype: float64 - name: sexual_explicit dtype: float64 - name: threat dtype: float64 - name: toxicity dtype: float64 - name: message_tree_id dtype: string - name: tree_state dtype: string - name: emojis struct: - name: count sequence: int64 - name: name sequence: string - name: labels struct: - name: count sequence: int64 - name: name sequence: string - name: value sequence: float64 splits: - name: train num_bytes: 103278547 num_examples: 84436 - name: validation num_bytes: 5361928 num_examples: 4400 download_size: 36303557 dataset_size: 108640475 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* ---
ayah-kamal/elsevier-annotated-min
--- task_categories: - text-classification language: - en pretty_name: Elsevier Shapeless Sentence Classification size_categories: - n<1K --- References: Daniel, R. (Creator), Groth, P. (Creator), Scerri, A. (Creator), Harper, C. A. (Creator), Vandenbussche, P. (Creator), Cox, J. (Creator) (2015). An Open Access Corpus of Scientific, Technical, and Medical Content. Github.
hlt-lab/dreamsample-negate_previous_utterance
--- dataset_info: features: - name: context dtype: string - name: response dtype: string - name: reference dtype: string splits: - name: train num_bytes: 50660 num_examples: 100 download_size: 37581 dataset_size: 50660 --- # Dataset Card for "dreamsample-negate_previous_utterance" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mehmetsozsoy/Orhan_Ozsoy_Voice_Model
--- license: cc-by-nc-2.0 ---
arthurmluz/GPTextSum2_data-xlsum_cstnews_results
--- dataset_info: features: - name: id dtype: int64 - name: text dtype: string - name: summary dtype: string - name: gen_summary dtype: string - name: rouge struct: - name: rouge1 dtype: float64 - name: rouge2 dtype: float64 - name: rougeL dtype: float64 - name: rougeLsum dtype: float64 - name: bert struct: - name: f1 sequence: float64 - name: hashcode dtype: string - name: precision sequence: float64 - name: recall sequence: float64 - name: moverScore dtype: float64 splits: - name: validation num_bytes: 90784 num_examples: 20 download_size: 88781 dataset_size: 90784 configs: - config_name: default data_files: - split: validation path: data/validation-* --- # Dataset Card for "gptextsum2_data-xlsum_cstnews_results" rouge= {'rouge1': 0.38748303853813626, 'rouge2': 0.18195048965428265, 'rougeL': 0.24222310213649534, 'rougeLsum': 0.24222310213649534} bert= {'precision': 0.7680569976568222, 'recall': 0.7077599495649338, 'f1': 0.7364159941673278} mover = 0.630959465845996
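For reference, a minimal sketch of recomputing the aggregate ROUGE-1 number from the per-example scores stored in the `rouge` struct; it assumes the reported aggregates are simple means over the 20 validation examples.

```python
from datasets import load_dataset

# Load the 20-example validation split of this results dataset.
ds = load_dataset("arthurmluz/GPTextSum2_data-xlsum_cstnews_results", split="validation")

# Average the per-example ROUGE-1 scores; this assumes the card's aggregate
# values are simple means of the per-example scores.
rouge1_mean = sum(row["rouge"]["rouge1"] for row in ds) / len(ds)
print(rouge1_mean)
```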
EleutherAI/quirky_nli_alice_easy
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* dataset_info: features: - name: label dtype: class_label: names: '0': entailment '1': neutral '2': contradiction - name: id dtype: string - name: choices sequence: string - name: bob_label dtype: int64 - name: difficulty dtype: float64 - name: statement dtype: string - name: character dtype: string - name: alice_label dtype: int64 splits: - name: train num_bytes: 331205.67582760775 num_examples: 1401 - name: validation num_bytes: 117898.06075 num_examples: 491 - name: test num_bytes: 108771.4505 num_examples: 458 download_size: 226698 dataset_size: 557875.1870776077 --- # Dataset Card for "quirky_nli_alice_easy" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
pop2189/document_ai
--- license: apache-2.0 ---
316usman/thematic3a-1-usman
--- dataset_info: features: - name: text dtype: string - name: document_url dtype: string - name: source_url dtype: string - name: country dtype: string splits: - name: train num_bytes: 88699781 num_examples: 143617 download_size: 31991286 dataset_size: 88699781 configs: - config_name: default data_files: - split: train path: data/train-* ---
ipipan/kashubian-wikipedia-clean-20230901
--- license: cc-by-sa-4.0 --- # Dataset Card for Clean Kashubian Wikipedia This is a cleaned and filtered snapshot of the Kashubian Wikipedia (dump of 2023-09-01). ## License CC BY-SA 4.0 ## Citation If you use this dataset, please cite the following paper: ``` @misc{rybak2024transferring, title={Transferring BERT Capabilities from High-Resource to Low-Resource Languages Using Vocabulary Matching}, author={Piotr Rybak}, year={2024}, eprint={2402.14408}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` ## Authors The dataset was created by Piotr Rybak from [Linguistic Engineering Group at Institute of Computer Science, Polish Academy of Sciences](http://zil.ipipan.waw.pl/). This work was supported by the European Regional Development Fund as a part of the 2014–2020 Smart Growth Operational Programme, CLARIN — Common Language Resources and Technology Infrastructure, project no. POIR.04.02.00-00C002/19.
cawoylel/FulaSpeechCorpora
--- configs: - config_name: default data_files: - split: pulaar path: data/pulaar-* - split: maacina path: data/maacina-* - split: liptako path: data/liptako-* - split: caka path: data/caka-* - split: bororro path: data/bororro-* - split: borgu path: data/borgu-* - split: pular path: data/pular-* - split: adamawa path: data/adamawa-* dataset_info: features: - name: audio dtype: audio - name: transcription dtype: string - name: dialect dtype: string splits: - name: pulaar num_bytes: 3398551955.96 num_examples: 12880 - name: maacina num_bytes: 2677353337.824 num_examples: 14336 - name: liptako num_bytes: 5858678478.536 num_examples: 36828 - name: caka num_bytes: 2790732470.205 num_examples: 14865 - name: bororro num_bytes: 2952498447.936 num_examples: 15022 - name: borgu num_bytes: 2849809213.278 num_examples: 13387 - name: pular num_bytes: 2339299211.055 num_examples: 11779 - name: adamawa num_bytes: 2225350403.136 num_examples: 13504 download_size: 20035287564 dataset_size: 25092273517.93 task_categories: - automatic-speech-recognition - text-to-speech - audio-classification language: - ff pretty_name: Fula Multidialectal Speech Corpora size_categories: - 100K<n<1M tags: - speech - low-ressource - audio --- # Purpose of the dataset Fula is a language spoken in at least 16 countries, with many dialectal varieties. However, the few NLP solutions that exist take only one dialect into account, often the Nigerian one. This is a speech-text dataset for 8 dialectal varieties of Fula, allowing the full diversity of the Fula language to be taken into account in the development of NLP solutions. # Fula varieties in this dataset This dataset contains 8 varieties: - __Pulaar__: spoken in Senegal, Mauritania and western Mali. - __Pular__: spoken in Guinea. - __Maacina__: spoken in central and eastern Mali. - __Liptako__: spoken in Burkina Faso and Niger. - __Caka__: spoken in central Nigeria. - __Bororro__: spoken by a highly nomadic group living in Cameroon, the Central African Republic and Chad. - __Borgu__: spoken in Togo and Benin. - __Adamawa__: spoken in Cameroon and south-eastern Nigeria. # Sources of the data Many of the corpora are from books automatically aligned using the [MMS Aligner](https://github.com/facebookresearch/fairseq/tree/main/examples/mms/data_prep).
You can check the scripts for scraping and aligning the corpora in the GitHub repository [https://github.com/cawoylel/FulaSpeechCorpora](https://github.com/cawoylel/FulaSpeechCorpora). For each variety, we give the source: - __Pulaar__: - We automatically align books from https://deftepulaar.com/ - We also added the Waxal Dataset from [https://huggingface.co/datasets/galsenai/waxal_dataset](https://huggingface.co/datasets/galsenai/waxal_dataset) - __Pular__: We automatically align the Bible books from [https://www.bible.com/bible/1798/MAT.1.VPFJ](https://www.bible.com/bible/1798/MAT.1.VPFJ) - __Maacina__: We automatically align the Bible books from [https://www.bible.com/bible/1175/MAT.1.FFM](https://www.bible.com/bible/1175/MAT.1.FFM) - __Liptako__: - We automatically align the Bible books from [https://www.bible.com/bible/1032/MAT.1.FBFNT](https://www.bible.com/bible/1032/MAT.1.FBFNT) - We added data from the [Fulfulde Webonary Dictionary](https://www.webonary.org/fulfuldeburkina/?lang=en) - We scraped many pages from [https://media.ipsapps.org](https://media.ipsapps.org), example page: [https://media.ipsapps.org/fuh/ora/co1/01-B001-001.html](https://media.ipsapps.org/fuh/ora/co1/01-B001-001.html) - __Caka__: We automatically align the Bible books from [https://www.bible.com/bible/1159/MAT.1.FUV](https://www.bible.com/bible/1159/MAT.1.FUV) - __Bororro__: We automatically align the Bible books from [https://www.bible.com/bible/1373/MAT.1.FUQ](https://www.bible.com/bible/1373/MAT.1.FUQ) - __Borgu__: We automatically align the Bible books from [https://www.bible.com/bible/3088/MAT.1.BFB](https://www.bible.com/bible/3088/MAT.1.BFB) - __Adamawa__: We automatically align the Bible books from [https://www.bible.com/bible/906/MAT.1.FB](https://www.bible.com/bible/906/MAT.1.FB) # How to use The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function. For example, to download the Borgu variety, simply specify the corresponding split name: ```python from datasets import load_dataset borgu_data = load_dataset("cawoylel/FulaSpeechCorpora", split="borgu") ``` You can also load the whole dataset: ```python from datasets import load_dataset data = load_dataset("cawoylel/FulaSpeechCorpora") ``` Using the datasets library, you can also stream the dataset on the fly by adding a `streaming=True` argument to the `load_dataset` function call. Loading a dataset in streaming mode loads individual samples at a time, rather than downloading the entire dataset to disk. ```python from datasets import load_dataset data = load_dataset("cawoylel/FulaSpeechCorpora", split="pulaar", streaming=True) print(next(iter(data))) ``` # Data Fields The data fields are the same among all splits (a short usage sketch is given at the end of this card). - **dialect** (str): The name of the dialect - **audio** (dict): Audio object including the loaded audio array, sampling rate and path to the audio file - **transcription** (str): Transcription of the audio file # Social Impact of Dataset Like many African languages, Fula is under-represented in NLP solutions. This dataset aims to bring more linguistic diversity. # Discussion of Biases The corpora mainly come from books read aloud and recorded in a studio, in noise-free conditions and with a certain degree of hyper-articulation by the readers. Models trained on these data may therefore be less robust to noise and to spontaneous speech.
Moreover, most of the speakers are adult males, which may make it harder for models trained on this data to generalize to other types of speakers. # Limitations Read speech, hyper-articulation by the readers, and limited robustness to noise and spontaneous speech. # Additional Information All datasets are licensed under the [Creative Commons license (CC-BY)](https://creativecommons.org/licenses/). # Citation Information Please cite us when using the FulaSpeechCorpora: ``` @article{fleurs2022arxiv, title = {FulaSpeechCorpora: A multidialectal speech dataset for Fula}, author = {Sy, Yaya and Doucouré, Dioula}, url = {https://huggingface.co/datasets/cawoylel/FulaSpeechCorpora}, year = {2023} } ```
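To make the Data Fields section above concrete, here is a minimal sketch for inspecting one decoded example; it assumes the default audio decoding of the `datasets` library and uses streaming to avoid downloading a full split.

```python
from datasets import load_dataset

# Stream a single example from the Maacina split.
maacina = load_dataset("cawoylel/FulaSpeechCorpora", split="maacina", streaming=True)
sample = next(iter(maacina))

print(sample["dialect"])                  # dialect label
print(sample["transcription"])            # transcription of the utterance
print(sample["audio"]["sampling_rate"])   # sampling rate of the decoded waveform
print(sample["audio"]["array"][:10])      # first decoded audio samples
```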
jamestalentium/xsum_250_rm
--- dataset_info: features: - name: input_text dtype: string - name: output_text dtype: string - name: id dtype: string splits: - name: train num_bytes: 587133.1850817222 num_examples: 250 download_size: 214345 dataset_size: 587133.1850817222 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "xsum_250_rm" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
311-data/2024
--- license: gpl-3.0 ---
PCScreen/ThomazJunior1
--- license: unknown ---
AI-Sweden/SuperLim
--- language: - sv multilinguality: - monolingual pretty_name: SuperLim task_categories: - question-answering - text-classification - sequence-modeling - other --- # Dataset Card for SuperLim ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Structure/Creation/Use/Additional Information](#dataset-structurecreationuseadditional-information) - [Dalaj](#dalaj) - [SweAna](#sweana) - [SweDiag](#swediag) - [SweFaq](#swefaq) - [SweFracas](#swefracas) - [SwePar](#swepar) - [SweSat](#swesat) - [SweSim](#swesim) - [SweWgr](#swewgr) - [SweWic](#swewic) - [SweWsc](#swewsc) ## Dataset Description - **Homepage:** [Språkbanken](https://spraakbanken.gu.se/en/resources/superlim) - **Repository:** / - **Paper:** / - **Leaderboard:** / - **Point of Contact:** [Contact Us](mailto:severine.verlinden@ai.se) ### Dataset Summary A standardized suite for evaluation and analysis of Swedish natural language understanding systems. ### Supported Tasks and Leaderboards Work in progress ### Languages Swedish ## Dataset Structure/Creation/Use/Additional Information ### Dalaj [dataset documentation](https://svn.spraakdata.gu.se/sb-arkiv/pub/dalaj/dalaj_documentation.tsv) ### SweAna [dataset documentation](https://svn.spraakdata.gu.se/sb-arkiv/pub/swedish_analogy/analogy_documentation_sheet.tsv) #### SweDiag work in progress ### SweFaq [dataset documentation](https://svn.spraakdata.gu.se/sb-arkiv/pub/faq/faq_documentation_sheet.tsv) ### SweFracas [dataset documentation](https://svn.spraakdata.gu.se/sb-arkiv/pub/swefracas/swefracas_documentation_sheet.tsv) ### SwePar [dataset documentation](https://svn.spraakdata.gu.se/sb-arkiv/pub/sweparaphrase/sweparaphrase_documentation.tsv) ### SweSat [dataset documentation](https://svn.spraakdata.gu.se/sb-arkiv/pub/swesat/swesat-synonyms_documentation_sheet.tsv) ### SweSim [dataset documentation](https://demo.spraakbanken.gu.se/gerlof/SuperSim/supersim-superlim_documentation_sheet.txt) ### SweWgr [dataset documentation](https://demo.spraakbanken.gu.se/gerlof/SweWinogender/swewinogender_documentation_sheet.txt) ### SweWic [dataset documentation](https://demo.spraakbanken.gu.se/gerlof/SweWiC/swewic_documentation_sheet.txt) ### SweWsc [dataset documentation](https://demo.spraakbanken.gu.se/gerlof/SweWinograd/swewinograd_documentation_sheet.txt)
szymonrucinski/truthful_qa_pl
--- configs: - config_name: generation data_files: - split: validation path: generation/validation-* - config_name: multiple_choice_1 data_files: - split: validation path: multiple_choice_1/validation-* dataset_info: - config_name: default features: - name: index dtype: int64 - name: question dtype: string - name: best_answer dtype: string - name: correct_answers sequence: string - name: incorrect_answers sequence: string - name: type dtype: string - name: category dtype: string - name: source dtype: string splits: - name: validation num_bytes: 529669 num_examples: 817 download_size: 262081 dataset_size: 529669 - config_name: generation features: - name: question dtype: string - name: best_answer dtype: string - name: correct_answers sequence: string - name: incorrect_answers sequence: string - name: type dtype: string - name: category dtype: string - name: source dtype: string splits: - name: validation num_bytes: 523133 num_examples: 817 download_size: 257197 dataset_size: 523133 - config_name: mc1 features: - name: question dtype: string - name: choices dtype: string - name: labels dtype: string splits: - name: train num_bytes: 302759 num_examples: 817 download_size: 145186 dataset_size: 302759 - config_name: multiple_choice_1 features: - name: question dtype: string - name: choices dtype: string - name: labels dtype: string splits: - name: validation num_bytes: 302759 num_examples: 817 download_size: 145186 dataset_size: 302759 --- # Dataset Card for "truthful_qa_pl" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
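Given the two configurations declared above, a minimal sketch of loading the Polish TruthfulQA validation data with `datasets`; field names are taken from the dataset_info block, so treat this as an untested sketch.

```python
from datasets import load_dataset

# Open-ended generation config: question plus best/correct/incorrect answers.
gen = load_dataset("szymonrucinski/truthful_qa_pl", "generation", split="validation")
print(gen[0]["question"], gen[0]["best_answer"])

# Single-true-answer multiple-choice config: question, choices and labels.
mc1 = load_dataset("szymonrucinski/truthful_qa_pl", "multiple_choice_1", split="validation")
print(mc1[0]["question"], mc1[0]["choices"], mc1[0]["labels"])
```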
deadbits/vigil-instruction-bypass-ada-002
--- tags: - embeddings - text - security pretty_name: 'Vigil: LLM Instruction Bypass text-embedding-ada-002' --- # Vigil: LLM Instruction Bypass text-embedding-ada-002 - **Repo:** [github.com/deadbits/vigil-llm](https://github.com/deadbits/vigil-llm) `Vigil` is a Python framework and REST API for assessing Large Language Model (LLM) prompts against a set of scanners to detect prompt injections, jailbreaks, and other potentially risky inputs. This repository contains `text-embedding-ada-002` embeddings for all Instruction Bypass style prompts ("Ignore instructions ...") used by [Vigil](https://github.com/deadbits/prompt-injection-defense). You can use the [parquet2vdb.py](https://github.com/deadbits/prompt-injection-defense/blob/main/vigil/utils/parquet2vdb.py) utility to load the embeddings into the Vigil chromadb instance, or use them in your own application. ## Format ```json [ { "text": str, "embedding": [], "model": "text-embedding-ada-002" } ] ``` Instruction bypass prompts generated with: https://gist.github.com/deadbits/e93a90aa36c9aa7b5ce1179597a6fe3d#file-generate-phrases-py
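Beyond `parquet2vdb.py`, a minimal sketch of using the embeddings directly for a nearest-neighbour similarity check; the `train` split name is an assumption, and the query vector must come from the same `text-embedding-ada-002` model.

```python
import numpy as np
from datasets import load_dataset

# Load the precomputed instruction-bypass embeddings (the "train" split name is assumed).
ds = load_dataset("deadbits/vigil-instruction-bypass-ada-002", split="train")

# Normalize the embedding matrix once so cosine similarity reduces to a dot product.
matrix = np.array(ds["embedding"], dtype=np.float32)
matrix /= np.linalg.norm(matrix, axis=1, keepdims=True)

def max_bypass_similarity(query_embedding) -> float:
    """Cosine similarity between a query embedding and the closest bypass prompt."""
    q = np.asarray(query_embedding, dtype=np.float32)
    q /= np.linalg.norm(q)
    return float((matrix @ q).max())

# query_embedding must be produced by text-embedding-ada-002 (e.g. via the OpenAI
# embeddings API); a high similarity suggests an instruction-bypass style prompt.
```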
Weslley07/Ryder
--- license: openrail ---
open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-Mk2-7b
--- pretty_name: Evaluation run of PocketDoc/Dans-AdventurousWinds-Mk2-7b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [PocketDoc/Dans-AdventurousWinds-Mk2-7b](https://huggingface.co/PocketDoc/Dans-AdventurousWinds-Mk2-7b)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-Mk2-7b_public\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-11-13T15:52:43.892204](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-Mk2-7b_public/blob/main/results_2023-11-13T15-52-43.892204.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6117457883588181,\n\ \ \"acc_stderr\": 0.03285127869008788,\n \"acc_norm\": 0.621056172344861,\n\ \ \"acc_norm_stderr\": 0.033574977794886766,\n \"mc1\": 0.28886168910648713,\n\ \ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.43563008850906,\n\ \ \"mc2_stderr\": 0.014459760341061523,\n \"em\": 0.0018875838926174498,\n\ \ \"em_stderr\": 0.00044451099905589315,\n \"f1\": 0.06191904362416096,\n\ \ \"f1_stderr\": 0.0014055022875998687\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.53839590443686,\n \"acc_stderr\": 0.014568245550296354,\n\ \ \"acc_norm\": 0.5819112627986348,\n \"acc_norm_stderr\": 0.014413988396996077\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6399123680541725,\n\ \ \"acc_stderr\": 0.004790445139186366,\n \"acc_norm\": 0.8347938657637921,\n\ \ \"acc_norm_stderr\": 0.003706075184380282\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\ \ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\ \ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \ \ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\ \ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \ \ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\ \ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\ \ },\n 
\"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\ \ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\ \ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\ : 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\ \ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\ \ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\ \ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\ \ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\ \ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\ \ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\ \ \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n\ \ \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n\ \ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\ acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\ \ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\ \ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n\ \ \"acc_stderr\": 0.024472243840895525,\n \"acc_norm\": 0.7548387096774194,\n\ \ \"acc_norm_stderr\": 0.024472243840895525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\ \ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\ : 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\ \ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 
0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\ acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\ \ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633507,\n \ \ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633507\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \ \ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\ \ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\ acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7981651376146789,\n \"acc_stderr\": 0.017208579357787575,\n \"\ acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.017208579357787575\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5787037037037037,\n \"acc_stderr\": 0.03367462138896078,\n \"\ acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.03367462138896078\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"\ acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n \ \ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\ \ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n\ \ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728742,\n\ \ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728742\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\ : 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\ \ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\ \ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\ \ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\ \ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\ \ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\ \ \"acc_norm\": 
0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\ \ \"acc_stderr\": 0.02336505149175372,\n \"acc_norm\": 0.8504273504273504,\n\ \ \"acc_norm_stderr\": 0.02336505149175372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.789272030651341,\n\ \ \"acc_stderr\": 0.014583812465862541,\n \"acc_norm\": 0.789272030651341,\n\ \ \"acc_norm_stderr\": 0.014583812465862541\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977254,\n\ \ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977254\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3240223463687151,\n\ \ \"acc_stderr\": 0.015652542496421118,\n \"acc_norm\": 0.3240223463687151,\n\ \ \"acc_norm_stderr\": 0.015652542496421118\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\ \ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\ \ \"acc_stderr\": 0.026664410886937613,\n \"acc_norm\": 0.6720257234726688,\n\ \ \"acc_norm_stderr\": 0.026664410886937613\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.025842248700902168,\n\ \ \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.025842248700902168\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \ \ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4028683181225554,\n\ \ \"acc_stderr\": 0.012526955577118016,\n \"acc_norm\": 0.4028683181225554,\n\ \ \"acc_norm_stderr\": 0.012526955577118016\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\ \ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6323529411764706,\n \"acc_stderr\": 0.019506291693954854,\n \ \ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.019506291693954854\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\ \ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\ \ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.03055531675557364,\n\ \ \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.03055531675557364\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n\ \ \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n\ \ \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \ \ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\ \ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\ 
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\ \ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28886168910648713,\n\ \ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.43563008850906,\n\ \ \"mc2_stderr\": 0.014459760341061523\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.011947592365207394\n\ \ },\n \"harness|drop|3\": {\n \"em\": 0.0018875838926174498,\n \ \ \"em_stderr\": 0.00044451099905589315,\n \"f1\": 0.06191904362416096,\n\ \ \"f1_stderr\": 0.0014055022875998687\n },\n \"harness|gsm8k|5\":\ \ {\n \"acc\": 0.14935557240333586,\n \"acc_stderr\": 0.009818090723727293\n\ \ }\n}\n```" repo_url: https://huggingface.co/PocketDoc/Dans-AdventurousWinds-Mk2-7b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|arc:challenge|25_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-11-13T15-52-43.892204.parquet' - config_name: harness_drop_3 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|drop|3_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|drop|3_2023-11-13T15-52-43.892204.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|gsm8k|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hellaswag|10_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-52-43.892204.parquet' - 
'**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-management|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-52-43.892204.parquet' - 
'**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-52-43.892204.parquet' - 
'**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-management|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-52-43.892204.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - 
'**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-52-43.892204.parquet' - 
config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-52-43.892204.parquet' - 
config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-management|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - 
'**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-52-43.892204.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|truthfulqa:mc|0_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-11-13T15-52-43.892204.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_11_13T15_52_43.892204 path: - '**/details_harness|winogrande|5_2023-11-13T15-52-43.892204.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-11-13T15-52-43.892204.parquet' - config_name: results data_files: - split: 2023_11_13T15_52_43.892204 path: - results_2023-11-13T15-52-43.892204.parquet - split: latest path: - results_2023-11-13T15-52-43.892204.parquet --- # Dataset Card for Evaluation run of PocketDoc/Dans-AdventurousWinds-Mk2-7b ## Dataset Description - **Homepage:** - **Repository:** 
https://huggingface.co/PocketDoc/Dans-AdventurousWinds-Mk2-7b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [PocketDoc/Dans-AdventurousWinds-Mk2-7b](https://huggingface.co/PocketDoc/Dans-AdventurousWinds-Mk2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)); a short sketch for loading this aggregated configuration appears at the end of this card. To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-Mk2-7b_public", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-11-13T15:52:43.892204](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-Mk2-7b_public/blob/main/results_2023-11-13T15-52-43.892204.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6117457883588181, "acc_stderr": 0.03285127869008788, "acc_norm": 0.621056172344861, "acc_norm_stderr": 0.033574977794886766, "mc1": 0.28886168910648713, "mc1_stderr": 0.01586634640138431, "mc2": 0.43563008850906, "mc2_stderr": 0.014459760341061523, "em": 0.0018875838926174498, "em_stderr": 0.00044451099905589315, "f1": 0.06191904362416096, "f1_stderr": 0.0014055022875998687 }, "harness|arc:challenge|25": { "acc": 0.53839590443686, "acc_stderr": 0.014568245550296354, "acc_norm": 0.5819112627986348, "acc_norm_stderr": 0.014413988396996077 }, "harness|hellaswag|10": { "acc": 0.6399123680541725, "acc_stderr": 0.004790445139186366, "acc_norm": 0.8347938657637921, "acc_norm_stderr": 0.003706075184380282 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5925925925925926, "acc_stderr": 0.04244633238353227, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.04244633238353227 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.625, "acc_stderr": 0.039397364351956274, "acc_norm": 0.625, "acc_norm_stderr": 0.039397364351956274 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6716981132075471, "acc_stderr": 0.02890159361241178, "acc_norm": 0.6716981132075471, "acc_norm_stderr": 0.02890159361241178 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7361111111111112, "acc_stderr": 0.03685651095897532, "acc_norm": 0.7361111111111112, "acc_norm_stderr": 0.03685651095897532 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.52, "acc_stderr": 
0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6242774566473989, "acc_stderr": 0.036928207672648664, "acc_norm": 0.6242774566473989, "acc_norm_stderr": 0.036928207672648664 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3431372549019608, "acc_stderr": 0.047240073523838876, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.047240073523838876 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.0440844002276808, "acc_norm": 0.74, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.0470070803355104, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.0470070803355104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4896551724137931, "acc_stderr": 0.041657747757287644, "acc_norm": 0.4896551724137931, "acc_norm_stderr": 0.041657747757287644 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.40476190476190477, "acc_stderr": 0.025279850397404904, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.025279850397404904 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7548387096774194, "acc_stderr": 0.024472243840895525, "acc_norm": 0.7548387096774194, "acc_norm_stderr": 0.024472243840895525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.032876667586034906, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.032876667586034906 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.02886977846026705, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.02886977846026705 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8652849740932642, "acc_stderr": 0.024639789097709443, "acc_norm": 0.8652849740932642, "acc_norm_stderr": 0.024639789097709443 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.658974358974359, "acc_stderr": 0.02403548967633507, "acc_norm": 0.658974358974359, "acc_norm_stderr": 0.02403548967633507 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3111111111111111, "acc_stderr": 0.028226446749683515, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.028226446749683515 }, "harness|hendrycksTest-high_school_microeconomics|5": 
{ "acc": 0.6848739495798319, "acc_stderr": 0.030176808288974337, "acc_norm": 0.6848739495798319, "acc_norm_stderr": 0.030176808288974337 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7981651376146789, "acc_stderr": 0.017208579357787575, "acc_norm": 0.7981651376146789, "acc_norm_stderr": 0.017208579357787575 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5787037037037037, "acc_stderr": 0.03367462138896078, "acc_norm": 0.5787037037037037, "acc_norm_stderr": 0.03367462138896078 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7794117647058824, "acc_stderr": 0.02910225438967407, "acc_norm": 0.7794117647058824, "acc_norm_stderr": 0.02910225438967407 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7468354430379747, "acc_stderr": 0.0283046579430353, "acc_norm": 0.7468354430379747, "acc_norm_stderr": 0.0283046579430353 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6636771300448431, "acc_stderr": 0.031708824268455005, "acc_norm": 0.6636771300448431, "acc_norm_stderr": 0.031708824268455005 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7557251908396947, "acc_stderr": 0.03768335959728742, "acc_norm": 0.7557251908396947, "acc_norm_stderr": 0.03768335959728742 }, "harness|hendrycksTest-international_law|5": { "acc": 0.743801652892562, "acc_stderr": 0.03984979653302871, "acc_norm": 0.743801652892562, "acc_norm_stderr": 0.03984979653302871 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7222222222222222, "acc_stderr": 0.04330043749650742, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.04330043749650742 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7423312883435583, "acc_stderr": 0.03436150827846917, "acc_norm": 0.7423312883435583, "acc_norm_stderr": 0.03436150827846917 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8504273504273504, "acc_stderr": 0.02336505149175372, "acc_norm": 0.8504273504273504, "acc_norm_stderr": 0.02336505149175372 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.789272030651341, "acc_stderr": 0.014583812465862541, "acc_norm": 0.789272030651341, "acc_norm_stderr": 0.014583812465862541 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6445086705202312, "acc_stderr": 0.025770292082977254, "acc_norm": 0.6445086705202312, "acc_norm_stderr": 0.025770292082977254 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3240223463687151, "acc_stderr": 0.015652542496421118, "acc_norm": 0.3240223463687151, "acc_norm_stderr": 0.015652542496421118 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7156862745098039, "acc_stderr": 0.02582916327275748, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.02582916327275748 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6720257234726688, "acc_stderr": 0.026664410886937613, "acc_norm": 0.6720257234726688, "acc_norm_stderr": 
0.026664410886937613 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6851851851851852, "acc_stderr": 0.025842248700902168, "acc_norm": 0.6851851851851852, "acc_norm_stderr": 0.025842248700902168 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.44680851063829785, "acc_stderr": 0.029658235097666907, "acc_norm": 0.44680851063829785, "acc_norm_stderr": 0.029658235097666907 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4028683181225554, "acc_stderr": 0.012526955577118016, "acc_norm": 0.4028683181225554, "acc_norm_stderr": 0.012526955577118016 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6617647058823529, "acc_stderr": 0.028739328513983572, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.028739328513983572 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6323529411764706, "acc_stderr": 0.019506291693954854, "acc_norm": 0.6323529411764706, "acc_norm_stderr": 0.019506291693954854 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 0.045820048415054174, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 0.045820048415054174 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6489795918367347, "acc_stderr": 0.03055531675557364, "acc_norm": 0.6489795918367347, "acc_norm_stderr": 0.03055531675557364 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7910447761194029, "acc_stderr": 0.028748298931728655, "acc_norm": 0.7910447761194029, "acc_norm_stderr": 0.028748298931728655 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.038823108508905954, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.038823108508905954 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8070175438596491, "acc_stderr": 0.030267457554898458, "acc_norm": 0.8070175438596491, "acc_norm_stderr": 0.030267457554898458 }, "harness|truthfulqa:mc|0": { "mc1": 0.28886168910648713, "mc1_stderr": 0.01586634640138431, "mc2": 0.43563008850906, "mc2_stderr": 0.014459760341061523 }, "harness|winogrande|5": { "acc": 0.7632202052091555, "acc_stderr": 0.011947592365207394 }, "harness|drop|3": { "em": 0.0018875838926174498, "em_stderr": 0.00044451099905589315, "f1": 0.06191904362416096, "f1_stderr": 0.0014055022875998687 }, "harness|gsm8k|5": { "acc": 0.14935557240333586, "acc_stderr": 0.009818090723727293 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
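For the aggregated metrics mentioned in the Dataset Summary above, here is a minimal loading sketch; the repository id, the "results" config name, and the "latest" split are taken from this card's configs, while the variable name is illustrative:

```python
from datasets import load_dataset

# Minimal sketch: load the aggregated metrics of the latest run for this model.
# The "latest" split always points to the newest results; the timestamped split
# (2023_11_13T15_52_43.892204) addresses the same run explicitly.
agg = load_dataset(
    "open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-Mk2-7b_public",
    "results",
    split="latest",
)
print(agg[0])
```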
kakashi38746/modelodevoz
--- license: openrail ---
neuralleap/Mistral_Test_02
--- license: apache-2.0 ---
trixdade/reviews_russian
--- task_categories: - summarization language: - ru size_categories: - n<1K ---
dim/pseudolab_medsi
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: instruction dtype: string - name: input dtype: string - name: output dtype: string splits: - name: train num_bytes: 319493 num_examples: 216 download_size: 201509 dataset_size: 319493 --- # Dataset Card for "pseudolab_medsi" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_Gille__StrangeMerges_45-7B-dare_ties
--- pretty_name: Evaluation run of Gille/StrangeMerges_45-7B-dare_ties dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Gille/StrangeMerges_45-7B-dare_ties](https://huggingface.co/Gille/StrangeMerges_45-7B-dare_ties)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_45-7B-dare_ties\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-27T17:43:51.074177](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_45-7B-dare_ties/blob/main/results_2024-03-27T17-43-51.074177.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6557159868760688,\n\ \ \"acc_stderr\": 0.031975236282732696,\n \"acc_norm\": 0.6552299252385686,\n\ \ \"acc_norm_stderr\": 0.03264213021862388,\n \"mc1\": 0.5128518971848225,\n\ \ \"mc1_stderr\": 0.017497717944299822,\n \"mc2\": 0.677933637972909,\n\ \ \"mc2_stderr\": 0.01472579442083734\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6723549488054608,\n \"acc_stderr\": 0.01371584794071934,\n\ \ \"acc_norm\": 0.6979522184300341,\n \"acc_norm_stderr\": 0.013417519144716413\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6906990639314877,\n\ \ \"acc_stderr\": 0.0046126082066704115,\n \"acc_norm\": 0.8760207130053774,\n\ \ \"acc_norm_stderr\": 0.0032888439778712584\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\ \ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\ \ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\ \ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\ \ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \ \ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n\ \ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 
0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\ \ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\ \ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\ \ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n\ \ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\ \ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\ \ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\ \ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\ \ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\ \ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\ acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\ \ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\ \ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723292,\n \"\ acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723292\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\ acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\ : 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\ \ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\ : 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\ \ 
\"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\ \ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\ \ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \ \ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188703,\n \ \ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188703\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\ acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8605504587155963,\n \"acc_stderr\": 0.014852421490033053,\n \"\ acc_norm\": 0.8605504587155963,\n \"acc_norm_stderr\": 0.014852421490033053\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\ acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250444,\n \"\ acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250444\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8227848101265823,\n \"acc_stderr\": 0.02485636418450322,\n \ \ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.02485636418450322\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\ \ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\ \ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\ \ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\ acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\ \ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\ \ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\ \ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\ \ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\ \ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\ \ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\ \ 
\"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\ \ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\ \ \"acc_stderr\": 0.013428186370608306,\n \"acc_norm\": 0.8301404853128991,\n\ \ \"acc_norm_stderr\": 0.013428186370608306\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n\ \ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n\ \ \"acc_stderr\": 0.01653117099327888,\n \"acc_norm\": 0.4245810055865922,\n\ \ \"acc_norm_stderr\": 0.01653117099327888\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\ \ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\ \ \"acc_stderr\": 0.025403832978179615,\n \"acc_norm\": 0.7234726688102894,\n\ \ \"acc_norm_stderr\": 0.025403832978179615\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.02389187954195961,\n\ \ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.02389187954195961\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \ \ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4758800521512386,\n\ \ \"acc_stderr\": 0.012755368722863935,\n \"acc_norm\": 0.4758800521512386,\n\ \ \"acc_norm_stderr\": 0.012755368722863935\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\ \ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \ \ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\ \ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \ \ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\ \ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\ \ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\ \ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \ \ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\ \ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\ \ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\ \ \"acc_norm\": 
0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5128518971848225,\n\ \ \"mc1_stderr\": 0.017497717944299822,\n \"mc2\": 0.677933637972909,\n\ \ \"mc2_stderr\": 0.01472579442083734\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918753\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7278241091736164,\n \ \ \"acc_stderr\": 0.01225971403516454\n }\n}\n```" repo_url: https://huggingface.co/Gille/StrangeMerges_45-7B-dare_ties leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|arc:challenge|25_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-27T17-43-51.074177.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|gsm8k|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hellaswag|10_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-43-51.074177.parquet' 
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-43-51.074177.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-43-51.074177.parquet' - 
'**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-43-51.074177.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - 
'**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-management|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-management|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - 
split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-43-51.074177.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|truthfulqa:mc|0_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-27T17-43-51.074177.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_27T17_43_51.074177 path: - '**/details_harness|winogrande|5_2024-03-27T17-43-51.074177.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-27T17-43-51.074177.parquet' - config_name: results data_files: - split: 2024_03_27T17_43_51.074177 path: - results_2024-03-27T17-43-51.074177.parquet - split: latest path: - results_2024-03-27T17-43-51.074177.parquet --- # Dataset Card for Evaluation run of Gille/StrangeMerges_45-7B-dare_ties <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_45-7B-dare_ties](https://huggingface.co/Gille/StrangeMerges_45-7B-dare_ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_45-7B-dare_ties", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-03-27T17:43:51.074177](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_45-7B-dare_ties/blob/main/results_2024-03-27T17-43-51.074177.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6557159868760688, "acc_stderr": 0.031975236282732696, "acc_norm": 0.6552299252385686, "acc_norm_stderr": 0.03264213021862388, "mc1": 0.5128518971848225, "mc1_stderr": 0.017497717944299822, "mc2": 0.677933637972909, "mc2_stderr": 0.01472579442083734 }, "harness|arc:challenge|25": { "acc": 0.6723549488054608, "acc_stderr": 0.01371584794071934, "acc_norm": 0.6979522184300341, "acc_norm_stderr": 0.013417519144716413 }, "harness|hellaswag|10": { "acc": 0.6906990639314877, "acc_stderr": 0.0046126082066704115, "acc_norm": 0.8760207130053774, "acc_norm_stderr": 0.0032888439778712584 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04072314811876837, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04072314811876837 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7105263157894737, "acc_stderr": 0.03690677986137283, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.03690677986137283 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7094339622641509, "acc_stderr": 0.027943219989337142, "acc_norm": 0.7094339622641509, "acc_norm_stderr": 0.027943219989337142 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6820809248554913, "acc_stderr": 0.0355068398916558, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.0355068398916558 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.46078431372549017, "acc_stderr": 0.04959859966384181, "acc_norm": 0.46078431372549017, "acc_norm_stderr": 0.04959859966384181 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909283, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909283 }, 
"harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5829787234042553, "acc_stderr": 0.03223276266711712, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41798941798941797, "acc_stderr": 0.025402555503260912, "acc_norm": 0.41798941798941797, "acc_norm_stderr": 0.025402555503260912 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.044518079590553275, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7806451612903226, "acc_stderr": 0.023540799358723292, "acc_norm": 0.7806451612903226, "acc_norm_stderr": 0.023540799358723292 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.02860620428922987, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.02860620428922987 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.020986854593289733, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.020986854593289733 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32592592592592595, "acc_stderr": 0.028578348365473082, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.028578348365473082 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6932773109243697, "acc_stderr": 0.02995382389188703, "acc_norm": 0.6932773109243697, "acc_norm_stderr": 0.02995382389188703 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8605504587155963, "acc_stderr": 0.014852421490033053, "acc_norm": 0.8605504587155963, "acc_norm_stderr": 0.014852421490033053 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 0.03408655867977749, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.03408655867977749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8529411764705882, "acc_stderr": 0.024857478080250444, "acc_norm": 0.8529411764705882, "acc_norm_stderr": 0.024857478080250444 }, 
"harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8227848101265823, "acc_stderr": 0.02485636418450322, "acc_norm": 0.8227848101265823, "acc_norm_stderr": 0.02485636418450322 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406964, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406964 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8301404853128991, "acc_stderr": 0.013428186370608306, "acc_norm": 0.8301404853128991, "acc_norm_stderr": 0.013428186370608306 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7427745664739884, "acc_stderr": 0.023532925431044283, "acc_norm": 0.7427745664739884, "acc_norm_stderr": 0.023532925431044283 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4245810055865922, "acc_stderr": 0.01653117099327888, "acc_norm": 0.4245810055865922, "acc_norm_stderr": 0.01653117099327888 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7189542483660131, "acc_stderr": 0.025738854797818733, "acc_norm": 0.7189542483660131, "acc_norm_stderr": 0.025738854797818733 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7234726688102894, "acc_stderr": 0.025403832978179615, "acc_norm": 0.7234726688102894, "acc_norm_stderr": 0.025403832978179615 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7561728395061729, "acc_stderr": 0.02389187954195961, "acc_norm": 0.7561728395061729, "acc_norm_stderr": 0.02389187954195961 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5035460992907801, "acc_stderr": 0.02982674915328092, "acc_norm": 0.5035460992907801, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4758800521512386, "acc_stderr": 0.012755368722863935, "acc_norm": 0.4758800521512386, "acc_norm_stderr": 0.012755368722863935 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6764705882352942, "acc_stderr": 0.02841820861940676, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.02841820861940676 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6781045751633987, "acc_stderr": 0.01890101532209309, "acc_norm": 
0.6781045751633987, "acc_norm_stderr": 0.01890101532209309 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7, "acc_stderr": 0.04389311454644286, "acc_norm": 0.7, "acc_norm_stderr": 0.04389311454644286 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.847953216374269, "acc_stderr": 0.027539122889061456, "acc_norm": 0.847953216374269, "acc_norm_stderr": 0.027539122889061456 }, "harness|truthfulqa:mc|0": { "mc1": 0.5128518971848225, "mc1_stderr": 0.017497717944299822, "mc2": 0.677933637972909, "mc2_stderr": 0.01472579442083734 }, "harness|winogrande|5": { "acc": 0.8232044198895028, "acc_stderr": 0.010721923287918753 }, "harness|gsm8k|5": { "acc": 0.7278241091736164, "acc_stderr": 0.01225971403516454 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. 
--> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
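## Loading the aggregated results

As a complement to the per-task loading example earlier in this card, the aggregated metrics for this run live in the `results` configuration declared in the YAML header. A minimal sketch, assuming only the config and split names listed above (the exact row layout of that configuration is not documented here, so the final `print` is exploratory):

```python
from datasets import load_dataset

# The "results" config defines a timestamped split plus a "latest" split
# that points to the newest results file for this model.
results = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_45-7B-dare_ties",
    "results",
    split="latest",
)

# Inspect the first row of aggregated metrics.
print(results[0])
```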
liuyanchen1015/MULTI_VALUE_qqp_negative_concord
--- dataset_info: features: - name: question1 dtype: string - name: question2 dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: dev num_bytes: 263412 num_examples: 1287 - name: test num_bytes: 2533925 num_examples: 12560 - name: train num_bytes: 2342158 num_examples: 11233 download_size: 3216285 dataset_size: 5139495 --- # Dataset Card for "MULTI_VALUE_qqp_negative_concord" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
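A minimal loading sketch based only on the metadata above (the feature names, their types, and the `dev`/`test`/`train` splits come straight from the YAML header; the exact meaning of `label` and `value_score` is not documented here):

```python
from datasets import load_dataset

# Load the train split declared in the YAML header.
ds = load_dataset("liuyanchen1015/MULTI_VALUE_qqp_negative_concord", split="train")

# Each row pairs two questions with an integer label and a value_score.
row = ds[0]
print(row["question1"])
print(row["question2"])
print(row["label"], row["value_score"], row["idx"])
```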
liuyanchen1015/MULTI_VALUE_sst2_double_superlative
--- dataset_info: features: - name: sentence dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: score dtype: int64 splits: - name: dev num_bytes: 5921 num_examples: 37 - name: test num_bytes: 11410 num_examples: 78 - name: train num_bytes: 173094 num_examples: 1695 download_size: 87021 dataset_size: 190425 --- # Dataset Card for "MULTI_VALUE_sst2_double_superlative" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CyberHarem/frieren_sousounofrieren
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of Frieren/フリーレン (Sousou No Frieren) This is the dataset of Frieren/フリーレン (Sousou No Frieren), containing 980 images and their tags. The core tags of this character are `pointy_ears, long_hair, white_hair, twintails, earrings, green_eyes, parted_bangs, grey_hair, dangle_earrings`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 980 | 651.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/frieren_sousounofrieren/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 1200 | 980 | 651.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/frieren_sousounofrieren/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1917 | 1.16 GiB | [Download](https://huggingface.co/datasets/CyberHarem/frieren_sousounofrieren/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/frieren_sousounofrieren', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, closed_mouth, elf, jewelry, portrait, solo, looking_at_viewer, thick_eyebrows, expressionless, anime_coloring, blurry_background, close-up, outdoors | | 1 | 11 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, closed_mouth, elf, jewelry, solo, looking_at_viewer, portrait, expressionless, thick_eyebrows | | 2 | 23 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, closed_mouth, elf, jewelry, solo, striped_shirt, upper_body, white_capelet, expressionless, striped_clothes, looking_at_viewer, outdoors | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, elf, jewelry, open_mouth, solo, striped_clothes, striped_shirt, upper_body, white_capelet, looking_at_viewer | | 4 | 18 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, elf, forest, jewelry, outdoors, striped_clothes, striped_shirt, tree, white_capelet, solo, closed_mouth, upper_body, looking_at_viewer, black_belt, expressionless | | 5 | 14 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, black_belt, elf, jewelry, striped_clothes, striped_shirt, white_capelet, closed_mouth, long_sleeves, white_skirt, solo, outdoors, looking_at_viewer, expressionless | | 6 | 11 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, elf, jewelry, profile, solo, white_capelet, closed_mouth, from_side, upper_body | | 7 | 10 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, black_belt, elf, holding_staff, jewelry, long_sleeves, striped_shirt, white_capelet, closed_mouth, outdoors, solo, striped_clothes, forest, mage_staff, tree | | 8 | 10 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, black_pantyhose, elf, gold_trim, jewelry, long_sleeves, outdoors, solo, striped_clothes, striped_shirt, tree, white_capelet, standing, white_skirt, forest, black_belt, closed_mouth, boots, brown_footwear | | 9 | 11 | ![](samples/9/clu9-sample0.png) | 
![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, elf, jewelry, outdoors, profile, solo, tree, closed_mouth, forest, from_side, white_capelet, upper_body | | 10 | 5 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, black_belt, black_pantyhose, elf, holding_staff, jewelry, long_sleeves, outdoors, sky, solo, standing, striped_clothes, striped_shirt, white_capelet, boots, brown_footwear, closed_mouth, cloud, tree, white_skirt, looking_at_viewer | | 11 | 6 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | 1girl, elf, forest, jewelry, open_mouth, outdoors, solo, tree, white_capelet, grass, looking_at_viewer | | 12 | 8 | ![](samples/12/clu12-sample0.png) | ![](samples/12/clu12-sample1.png) | ![](samples/12/clu12-sample2.png) | ![](samples/12/clu12-sample3.png) | ![](samples/12/clu12-sample4.png) | 1girl, closed_mouth, elf, outdoors, solo, forest, jewelry, looking_at_viewer, tree, expressionless, portrait, blurry_background | | 13 | 7 | ![](samples/13/clu13-sample0.png) | ![](samples/13/clu13-sample1.png) | ![](samples/13/clu13-sample2.png) | ![](samples/13/clu13-sample3.png) | ![](samples/13/clu13-sample4.png) | 1girl, elf, jewelry, looking_at_viewer, solo, open_mouth, thick_eyebrows, meme | | 14 | 17 | ![](samples/14/clu14-sample0.png) | ![](samples/14/clu14-sample1.png) | ![](samples/14/clu14-sample2.png) | ![](samples/14/clu14-sample3.png) | ![](samples/14/clu14-sample4.png) | 1girl, blue_scarf, elf, jewelry, solo, upper_body, closed_mouth, outdoors, white_coat, winter_clothes | | 15 | 12 | ![](samples/15/clu15-sample0.png) | ![](samples/15/clu15-sample1.png) | ![](samples/15/clu15-sample2.png) | ![](samples/15/clu15-sample3.png) | ![](samples/15/clu15-sample4.png) | 1girl, elf, jewelry, night, outdoors, solo, forest, long_sleeves, tree, blue_scarf, white_coat, green_scarf, closed_mouth, lantern, black_pantyhose, thick_eyebrows, holding, squatting | | 16 | 8 | ![](samples/16/clu16-sample0.png) | ![](samples/16/clu16-sample1.png) | ![](samples/16/clu16-sample2.png) | ![](samples/16/clu16-sample3.png) | ![](samples/16/clu16-sample4.png) | 1girl, closed_mouth, elf, from_side, jewelry, profile, solo_focus, white_capelet, multiple_boys, blurry, dwarf, long_sleeves, upper_body | | 17 | 6 | ![](samples/17/clu17-sample0.png) | ![](samples/17/clu17-sample1.png) | ![](samples/17/clu17-sample2.png) | ![](samples/17/clu17-sample3.png) | ![](samples/17/clu17-sample4.png) | 1girl, closed_mouth, elf, outdoors, solo, upper_body, white_dress, collarbone, forest, tree, hair_down, expressionless, from_side, looking_at_viewer, profile, sleeveless_dress | | 18 | 5 | ![](samples/18/clu18-sample0.png) | ![](samples/18/clu18-sample1.png) | ![](samples/18/clu18-sample2.png) | ![](samples/18/clu18-sample3.png) | ![](samples/18/clu18-sample4.png) | 1boy, 1girl, dwarf, elf, horned_helmet, jewelry, long_beard, outdoors, armor, closed_mouth, striped_shirt, looking_at_viewer, striped_clothes, thick_eyebrows, upper_body, white_capelet | | 19 | 6 | ![](samples/19/clu19-sample0.png) | ![](samples/19/clu19-sample1.png) | ![](samples/19/clu19-sample2.png) | ![](samples/19/clu19-sample3.png) | ![](samples/19/clu19-sample4.png) | 2girls, boots, elf, black_pantyhose, 
blue_scarf, brown_footwear, coat, jewelry, long_sleeves, sitting, winter_clothes, outdoors, solo_focus | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | elf | jewelry | portrait | solo | looking_at_viewer | thick_eyebrows | expressionless | anime_coloring | blurry_background | close-up | outdoors | striped_shirt | upper_body | white_capelet | striped_clothes | open_mouth | forest | tree | black_belt | long_sleeves | white_skirt | profile | from_side | holding_staff | mage_staff | black_pantyhose | gold_trim | standing | boots | brown_footwear | sky | cloud | grass | meme | blue_scarf | white_coat | winter_clothes | night | green_scarf | lantern | holding | squatting | solo_focus | multiple_boys | blurry | dwarf | white_dress | collarbone | hair_down | sleeveless_dress | 1boy | horned_helmet | long_beard | armor | 2girls | coat | sitting | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------|:------|:----------|:-----------|:-------|:--------------------|:-----------------|:-----------------|:-----------------|:--------------------|:-----------|:-----------|:----------------|:-------------|:----------------|:------------------|:-------------|:---------|:-------|:-------------|:---------------|:--------------|:----------|:------------|:----------------|:-------------|:------------------|:------------|:-----------|:--------|:-----------------|:------|:--------|:--------|:-------|:-------------|:-------------|:-----------------|:--------|:--------------|:----------|:----------|:------------|:-------------|:----------------|:---------|:--------|:--------------|:-------------|:------------|:-------------------|:-------|:----------------|:-------------|:--------|:---------|:-------|:----------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 11 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 23 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | | X | X | | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | X | | X | X | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 18 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | X | | X | X | | X | | | | X | X | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 
| 14 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | X | X | | X | X | | X | | | | X | X | | X | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 11 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | X | | X | | | | | | | | | X | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 10 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | X | X | | X | | | | | | | X | X | | X | X | | X | X | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 10 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | X | X | | X | | | | | | | X | X | | X | X | | X | X | X | X | X | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 11 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | X | X | X | | X | | | | | | | X | | X | X | | | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 10 | 5 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | X | X | X | | X | X | | | | | | X | X | | X | X | | | X | X | X | X | | | X | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | 11 | 6 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | X | | X | X | | X | X | | | | | | X | | | X | | X | X | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | 12 | 8 | ![](samples/12/clu12-sample0.png) | ![](samples/12/clu12-sample1.png) | ![](samples/12/clu12-sample2.png) | ![](samples/12/clu12-sample3.png) | ![](samples/12/clu12-sample4.png) | X | X | X | X | X | X | X | | X | | X | | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 13 | 7 | ![](samples/13/clu13-sample0.png) | ![](samples/13/clu13-sample1.png) | ![](samples/13/clu13-sample2.png) | ![](samples/13/clu13-sample3.png) | ![](samples/13/clu13-sample4.png) | X | | X | X | | X | X | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | 14 | 17 | ![](samples/14/clu14-sample0.png) | ![](samples/14/clu14-sample1.png) | ![](samples/14/clu14-sample2.png) | ![](samples/14/clu14-sample3.png) | ![](samples/14/clu14-sample4.png) | X | X | X | X | | X | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | 15 | 12 | ![](samples/15/clu15-sample0.png) | ![](samples/15/clu15-sample1.png) | ![](samples/15/clu15-sample2.png) | ![](samples/15/clu15-sample3.png) | ![](samples/15/clu15-sample4.png) | X | X | X | X | | X | | X | | | | | X | | | | | | X | X | | X | 
| | | | | X | | | | | | | | | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | 16 | 8 | ![](samples/16/clu16-sample0.png) | ![](samples/16/clu16-sample1.png) | ![](samples/16/clu16-sample2.png) | ![](samples/16/clu16-sample3.png) | ![](samples/16/clu16-sample4.png) | X | X | X | X | | | | | | | | | | | X | X | | | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | 17 | 6 | ![](samples/17/clu17-sample0.png) | ![](samples/17/clu17-sample1.png) | ![](samples/17/clu17-sample2.png) | ![](samples/17/clu17-sample3.png) | ![](samples/17/clu17-sample4.png) | X | X | X | | | X | X | | X | | | | X | | X | | | | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | 18 | 5 | ![](samples/18/clu18-sample0.png) | ![](samples/18/clu18-sample1.png) | ![](samples/18/clu18-sample2.png) | ![](samples/18/clu18-sample3.png) | ![](samples/18/clu18-sample4.png) | X | X | X | X | | | X | X | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | | | | | 19 | 6 | ![](samples/19/clu19-sample0.png) | ![](samples/19/clu19-sample1.png) | ![](samples/19/clu19-sample2.png) | ![](samples/19/clu19-sample3.png) | ![](samples/19/clu19-sample4.png) | | | X | X | | | | | | | | | X | | | | | | | | | X | | | | | | X | | | X | X | | | | | X | | X | | | | | | X | | | | | | | | | | | | X | X | X |
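The IMG+TXT packages (e.g. `dataset-1200.zip` above) can also be used without waifuc. A minimal sketch, assuming each image in the archive is paired with a same-named `.txt` file holding its tags (the exact layout inside the zip is not documented here):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the pre-processed IMG+TXT archive
zip_file = hf_hub_download(
    repo_id='CyberHarem/frieren_sousounofrieren',
    repo_type='dataset',
    filename='dataset-1200.zip',
)

# extract files to your directory
dataset_dir = 'dataset_1200'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair every image with its (assumed) side-car .txt tag file
for name in sorted(os.listdir(dataset_dir)):
    if name.lower().endswith(('.png', '.jpg', '.jpeg', '.webp')):
        tag_path = os.path.splitext(os.path.join(dataset_dir, name))[0] + '.txt'
        if os.path.exists(tag_path):
            with open(tag_path, 'r', encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```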
Lollitor/CASFMarked
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: ID dtype: string - name: INPUT dtype: string splits: - name: train num_bytes: 294982 num_examples: 285 download_size: 120329 dataset_size: 294982 --- # Dataset Card for "CASFMarked" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jlbaker361/flickr_humans_dim_128_0.5k_vangogh
--- dataset_info: features: - name: image dtype: image - name: split dtype: string - name: style dtype: string splits: - name: train num_bytes: 17342368.0 num_examples: 500 download_size: 17336424 dataset_size: 17342368.0 --- # Dataset Card for "flickr_humans_dim_128_0.5k_vangogh" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
liuyanchen1015/MULTI_VALUE_rte_drop_aux_be_gonna
--- dataset_info: features: - name: sentence1 dtype: string - name: sentence2 dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: test num_bytes: 4587 num_examples: 8 - name: train num_bytes: 4110 num_examples: 6 download_size: 18134 dataset_size: 8697 --- # Dataset Card for "MULTI_VALUE_rte_drop_aux_be_gonna" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
zion84006/tencentdata_speech_tokenizer
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: valid path: data/valid-* - split: test path: data/test-* dataset_info: features: - name: file_id dtype: int64 - name: wav_id dtype: int64 - name: instruction dtype: string - name: transcription dtype: string - name: src_speech_tokenizer_0 sequence: int64 - name: src_speech_tokenizer_1 sequence: int64 - name: src_speech_tokenizer_2 sequence: int64 - name: src_speech_tokenizer_3 sequence: int64 - name: src_speech_tokenizer_4 sequence: int64 - name: src_speech_tokenizer_5 sequence: int64 - name: src_speech_tokenizer_6 sequence: int64 - name: src_speech_tokenizer_7 sequence: int64 - name: tgt_speech_tokenizer_0 sequence: int64 - name: tgt_speech_tokenizer_1 sequence: int64 - name: tgt_speech_tokenizer_2 sequence: int64 - name: tgt_speech_tokenizer_3 sequence: int64 - name: tgt_speech_tokenizer_4 sequence: int64 - name: tgt_speech_tokenizer_5 sequence: int64 - name: tgt_speech_tokenizer_6 sequence: int64 - name: tgt_speech_tokenizer_7 sequence: int64 splits: - name: train num_bytes: 12406092460 num_examples: 266780 - name: valid num_bytes: 352367844 num_examples: 7620 - name: test num_bytes: 339389388 num_examples: 7620 download_size: 708155490 dataset_size: 13097849692 --- # Dataset Card for "tencentdata_speech_tokenizer" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
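The YAML header above is the only documentation in this card, so the following is a sketch based purely on the listed feature names and splits (whether the eight SpeechTokenizer streams of a row share the same length is an assumption, not something stated here):

```python
from datasets import load_dataset

# Load the validation split declared in the YAML header.
ds = load_dataset("zion84006/tencentdata_speech_tokenizer", split="valid")

row = ds[0]
print(row["instruction"])
print(row["transcription"])

# Compare the lengths of the eight source codebook streams for this row.
print([len(row[f"src_speech_tokenizer_{i}"]) for i in range(8)])
```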
Kai1014/facemask-kaggle
--- language: - en license: - odbl pretty_name: Face Mask Detection size_categories: - 1K<n<10K source_datasets: - original task_categories: - image-classification --- ## Dataset Description - **Homepage:** [Face Mask Detection Dataset](https://www.kaggle.com/datasets/vijaykumar1799/face-mask-detection) - **Repository:** N/A - **Paper:** N/A - **Leaderboard:** N/A - **Point of Contact:** N/A ## Dataset Summary A dataset from [kaggle](https://www.kaggle.com/datasets/vijaykumar1799/face-mask-detection). origin: https://dphi.tech/challenges/data-sprint-76-human-activity-recognition/233/data ### Introduction - ### PROBLEM STATEMENT - ### About Files - Train - contains all the images that are to be used for training your model. In this folder you will find 15 folders namely - 'calling', ’clapping’, ’cycling’, ’dancing’, ‘drinking’, ‘eating’, ‘fighting’, ‘hugging’, ‘laughing’, ‘listeningtomusic’, ‘running’, ‘sitting’, ‘sleeping’, texting’, ‘using_laptop’ which contain the images of the respective human activities. - Test - contains 5400 images of Human Activities. For these images you are required to make predictions as the respective class names -'calling', ’clapping’, ’cycling’, ’dancing’, ‘drinking’, ‘eating’, ‘fighting’, ‘hugging’, ‘laughing’, ‘listeningtomusic’, ‘running’, ‘sitting’, ‘sleeping’, texting’, ‘using_laptop’. - Testing_set.csv - this is the order of the predictions for each image that is to be submitted on the platform. Make sure the predictions you download are with their image’s filename in the same order as given in this file. - sample_submission: This is a csv file that contains the sample submission for the data sprint. ### Data Fields The data instances have the following fields: - `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`. - `labels`: an `int` classification label. All `test` data is labeled 0. ### Class Label Mappings: ``` { 'mask_weared_incorrect': 0, 'with_mask': 1, 'without_mask': 2 } ``` ### Data Splits | | train | test | validation| |---------------|--------|------|----------:| | # of examples | 1500 | 180 | 180 ### Data Size - download: 46 MiB - generated: 46.8 MiB - total: 92.8 MiB ```pycon >>> from datasets import load_dataset >>> ds = load_dataset("poolrf2001/mask") >>> ds DatasetDict({ test: Dataset({ features: ['image', 'labels'], num_rows: 180 }) train: Dataset({ features: ['image', 'labels'], num_rows: 1500 }) validation: Dataset({ features: ['image', 'labels'], num_rows: 180 }) }) >>> ds["train"].features {'image': Image(decode=True, id=None), 'labels': ClassLabel(num_classes=3, names=['mask_weared_incorrect', 'with_mask', 'without_mask'], id=None)} >>> ds["train"][0] {'image': <PIL.PngImagePlugin.PngImageFile image mode=RGB size=180x180>, 'labels': 1} ```
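To turn the integer labels back into the class names listed above, the `ClassLabel` feature shown in `ds["train"].features` can do the conversion; a small sketch following directly from the mapping and the example row above:

```pycon
>>> labels_feature = ds["train"].features["labels"]
>>> labels_feature.int2str(ds["train"][0]["labels"])
'with_mask'
```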
yuvalkirstain/pick_a_pic_preferred_images_first_day
--- dataset_info: features: - name: image_id dtype: int64 - name: created_at dtype: timestamp[ns] - name: image_uid dtype: string - name: user_id dtype: int64 - name: prompt dtype: string - name: negative_prompt dtype: string - name: seed dtype: int64 - name: gs dtype: float64 - name: steps dtype: int64 - name: idx dtype: int64 - name: num_generated dtype: int64 - name: scheduler_cls dtype: string - name: model_id dtype: string - name: url dtype: string - name: image dtype: image splits: - name: train num_bytes: 686322947.851 num_examples: 1001 download_size: 685855336 dataset_size: 686322947.851 --- # Dataset Card for "pick_a_pic_preferred_images_first_day" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
japanese-asr/whisper_transcriptions.reazonspeech.all_23
--- dataset_info: config_name: all features: - name: name dtype: string - name: audio dtype: audio: sampling_rate: 16000 - name: transcription dtype: string - name: whisper_transcript sequence: int64 splits: - name: train num_bytes: 30358131493.0 num_examples: 267162 download_size: 30116010429 dataset_size: 30358131493.0 configs: - config_name: all data_files: - split: train path: all/train-* ---
ibranze/araproje_hellaswag_tr_conf2
--- dataset_info: features: - name: ind dtype: int32 - name: activity_label dtype: string - name: ctx_a dtype: string - name: ctx_b dtype: string - name: ctx dtype: string - name: endings sequence: string - name: source_id dtype: string - name: split dtype: string - name: split_type dtype: string - name: label dtype: string splits: - name: validation num_bytes: 162703.0 num_examples: 250 download_size: 86220 dataset_size: 162703.0 configs: - config_name: default data_files: - split: validation path: data/validation-* --- # Dataset Card for "araproje_hellaswag_tr_conf2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mattlc/multilingual-TEDX-fr-duration
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* - split: validation path: data/validation-* dataset_info: features: - name: file dtype: string - name: audio dtype: audio: sampling_rate: 16000 - name: sentence dtype: string - name: speaker_id dtype: string - name: start_timestamp dtype: float32 - name: end_timestamp dtype: float32 - name: index dtype: int32 - name: duration dtype: float64 - name: text dtype: string splits: - name: train num_bytes: 20290217368.375 num_examples: 116045 - name: test num_bytes: 179302302.625 num_examples: 1059 - name: validation num_bytes: 179302302.625 num_examples: 1059 download_size: 20376737131 dataset_size: 20648821973.625 --- # Dataset Card for "multilingual-TEDX-fr-duration" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
BitBasher/mini-ibased-dataset
--- dataset_info: features: - name: instruction dtype: string - name: output dtype: string splits: - name: train num_bytes: 53310 num_examples: 55 download_size: 28265 dataset_size: 53310 configs: - config_name: default data_files: - split: train path: data/train-* ---
SouBryan/Cod_MW2019_Precision_Airstrike_Dataset
--- license: mit ---
AIGym/gpt-data-pile
--- language: - en dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 1103884170 num_examples: 305721 download_size: 599491172 dataset_size: 1103884170 configs: - config_name: default data_files: - split: train path: data/train-* ---
modelloosrvcc/carrodoovo
--- license: openrail ---
open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Psyfighter2
--- pretty_name: Evaluation run of KoboldAI/LLaMA2-13B-Psyfighter2 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [KoboldAI/LLaMA2-13B-Psyfighter2](https://huggingface.co/KoboldAI/LLaMA2-13B-Psyfighter2)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Psyfighter2\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-12-04T11:57:24.228849](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Psyfighter2/blob/main/results_2023-12-04T11-57-24.228849.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5470210161245963,\n\ \ \"acc_stderr\": 0.033586335697642675,\n \"acc_norm\": 0.5564143725807108,\n\ \ \"acc_norm_stderr\": 0.03444006583011199,\n \"mc1\": 0.3769889840881273,\n\ \ \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5299552830341843,\n\ \ \"mc2_stderr\": 0.01569290592260198\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5725255972696246,\n \"acc_stderr\": 0.014456862944650649,\n\ \ \"acc_norm\": 0.6006825938566553,\n \"acc_norm_stderr\": 0.014312094557946707\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6502688707428799,\n\ \ \"acc_stderr\": 0.004759103432380757,\n \"acc_norm\": 0.8401712806213901,\n\ \ \"acc_norm_stderr\": 0.0036569821653861826\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\ \ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\ \ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.040260970832965634,\n\ \ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.040260970832965634\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\ \ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \ \ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\ \ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\ \ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\ \ \"acc_norm_stderr\": 0.04101405519842426\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\ \ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\ \ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\ \ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n\ \ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n\ \ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n\ \ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\ \ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\ \ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\ \ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101813,\n \"\ acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101813\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\ \ \"acc_stderr\": 0.04073524322147125,\n \"acc_norm\": 0.29365079365079366,\n\ \ \"acc_norm_stderr\": 0.04073524322147125\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n\ \ \"acc_stderr\": 0.027273890594300645,\n \"acc_norm\": 0.6419354838709678,\n\ \ \"acc_norm_stderr\": 0.027273890594300645\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.034767257476490364,\n\ \ \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.034767257476490364\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\ : 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538115,\n\ \ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538115\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"\ acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n\ \ },\n 
\"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.03003114797764154,\n\ \ \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.03003114797764154\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5051282051282051,\n \"acc_stderr\": 0.025349672906838653,\n\ \ \"acc_norm\": 0.5051282051282051,\n \"acc_norm_stderr\": 0.025349672906838653\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3,\n \"acc_stderr\": 0.0279404571362284,\n \"acc_norm\":\ \ 0.3,\n \"acc_norm_stderr\": 0.0279404571362284\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\ : {\n \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.032284106267163895,\n\ \ \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.032284106267163895\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\ acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7302752293577982,\n \"acc_stderr\": 0.019028486711115438,\n \"\ acc_norm\": 0.7302752293577982,\n \"acc_norm_stderr\": 0.019028486711115438\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\ acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\ \ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n\ \ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\ \ \"acc_stderr\": 0.03160295143776678,\n \"acc_norm\": 0.6681614349775785,\n\ \ \"acc_norm_stderr\": 0.03160295143776678\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\ \ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\ : 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\ \ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\ \ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\ \ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\ \ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\ \ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\ \ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\ \ \"acc_stderr\": 0.02514093595033544,\n \"acc_norm\": 0.8205128205128205,\n\ \ 
\"acc_norm_stderr\": 0.02514093595033544\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \ \ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7471264367816092,\n\ \ \"acc_stderr\": 0.015543377313719683,\n \"acc_norm\": 0.7471264367816092,\n\ \ \"acc_norm_stderr\": 0.015543377313719683\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.026074314851657083,\n\ \ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.026074314851657083\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34301675977653634,\n\ \ \"acc_stderr\": 0.015876912673057738,\n \"acc_norm\": 0.34301675977653634,\n\ \ \"acc_norm_stderr\": 0.015876912673057738\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023344,\n\ \ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023344\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\ \ \"acc_stderr\": 0.027604689028581986,\n \"acc_norm\": 0.617363344051447,\n\ \ \"acc_norm_stderr\": 0.027604689028581986\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132146,\n\ \ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132146\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.40425531914893614,\n \"acc_stderr\": 0.02927553215970473,\n \ \ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.02927553215970473\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42959582790091266,\n\ \ \"acc_stderr\": 0.012643004623790203,\n \"acc_norm\": 0.42959582790091266,\n\ \ \"acc_norm_stderr\": 0.012643004623790203\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\ \ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5637254901960784,\n \"acc_stderr\": 0.02006287424353913,\n \ \ \"acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.02006287424353913\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\ \ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \ \ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n\ \ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\ \ \"acc_stderr\": 0.03152439186555401,\n \"acc_norm\": 0.7263681592039801,\n\ \ \"acc_norm_stderr\": 0.03152439186555401\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \ \ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\ \ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\ \ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\ \ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\ \ 
},\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n\ \ \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5299552830341843,\n\ \ \"mc2_stderr\": 0.01569290592260198\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759987\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.014404852160727824,\n \ \ \"acc_stderr\": 0.003282055917136976\n }\n}\n```" repo_url: https://huggingface.co/KoboldAI/LLaMA2-13B-Psyfighter2 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|arc:challenge|25_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-12-04T11-57-24.228849.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|gsm8k|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hellaswag|10_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-57-24.228849.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-57-24.228849.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-57-24.228849.parquet' - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-04T11-57-24.228849.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - 
'**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - 
'**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-management|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-management|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - 
split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-virology|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-57-24.228849.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|truthfulqa:mc|0_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-12-04T11-57-24.228849.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_12_04T11_57_24.228849 path: - '**/details_harness|winogrande|5_2023-12-04T11-57-24.228849.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-12-04T11-57-24.228849.parquet' - config_name: results data_files: - split: 2023_12_04T11_57_24.228849 path: - results_2023-12-04T11-57-24.228849.parquet - split: latest path: - results_2023-12-04T11-57-24.228849.parquet --- # Dataset Card for Evaluation run of KoboldAI/LLaMA2-13B-Psyfighter2 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/KoboldAI/LLaMA2-13B-Psyfighter2 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [KoboldAI/LLaMA2-13B-Psyfighter2](https://huggingface.co/KoboldAI/LLaMA2-13B-Psyfighter2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s).
Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Psyfighter2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T11:57:24.228849](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Psyfighter2/blob/main/results_2023-12-04T11-57-24.228849.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5470210161245963, "acc_stderr": 0.033586335697642675, "acc_norm": 0.5564143725807108, "acc_norm_stderr": 0.03444006583011199, "mc1": 0.3769889840881273, "mc1_stderr": 0.016965517578930354, "mc2": 0.5299552830341843, "mc2_stderr": 0.01569290592260198 }, "harness|arc:challenge|25": { "acc": 0.5725255972696246, "acc_stderr": 0.014456862944650649, "acc_norm": 0.6006825938566553, "acc_norm_stderr": 0.014312094557946707 }, "harness|hellaswag|10": { "acc": 0.6502688707428799, "acc_stderr": 0.004759103432380757, "acc_norm": 0.8401712806213901, "acc_norm_stderr": 0.0036569821653861826 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.04688261722621503, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621503 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4888888888888889, "acc_stderr": 0.04318275491977976, "acc_norm": 0.4888888888888889, "acc_norm_stderr": 0.04318275491977976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5723684210526315, "acc_stderr": 0.040260970832965634, "acc_norm": 0.5723684210526315, "acc_norm_stderr": 0.040260970832965634 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5962264150943396, "acc_stderr": 0.03019761160019795, "acc_norm": 0.5962264150943396, "acc_norm_stderr": 0.03019761160019795 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5972222222222222, "acc_stderr": 0.04101405519842426, "acc_norm": 0.5972222222222222, "acc_norm_stderr": 0.04101405519842426 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5375722543352601, "acc_stderr": 0.0380168510452446, "acc_norm": 0.5375722543352601, "acc_norm_stderr": 0.0380168510452446 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.04576665403207762, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.04576665403207762 },
"harness|hendrycksTest-computer_security|5": { "acc": 0.66, "acc_stderr": 0.04760952285695238, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695238 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.46808510638297873, "acc_stderr": 0.03261936918467382, "acc_norm": 0.46808510638297873, "acc_norm_stderr": 0.03261936918467382 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.35964912280701755, "acc_stderr": 0.045144961328736334, "acc_norm": 0.35964912280701755, "acc_norm_stderr": 0.045144961328736334 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5310344827586206, "acc_stderr": 0.04158632762097828, "acc_norm": 0.5310344827586206, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.30158730158730157, "acc_stderr": 0.023636975996101813, "acc_norm": 0.30158730158730157, "acc_norm_stderr": 0.023636975996101813 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.29365079365079366, "acc_stderr": 0.04073524322147125, "acc_norm": 0.29365079365079366, "acc_norm_stderr": 0.04073524322147125 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6419354838709678, "acc_stderr": 0.027273890594300645, "acc_norm": 0.6419354838709678, "acc_norm_stderr": 0.027273890594300645 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4236453201970443, "acc_stderr": 0.034767257476490364, "acc_norm": 0.4236453201970443, "acc_norm_stderr": 0.034767257476490364 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6787878787878788, "acc_stderr": 0.036462049632538115, "acc_norm": 0.6787878787878788, "acc_norm_stderr": 0.036462049632538115 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7070707070707071, "acc_stderr": 0.032424979581788166, "acc_norm": 0.7070707070707071, "acc_norm_stderr": 0.032424979581788166 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7772020725388601, "acc_stderr": 0.03003114797764154, "acc_norm": 0.7772020725388601, "acc_norm_stderr": 0.03003114797764154 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5051282051282051, "acc_stderr": 0.025349672906838653, "acc_norm": 0.5051282051282051, "acc_norm_stderr": 0.025349672906838653 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3, "acc_stderr": 0.0279404571362284, "acc_norm": 0.3, "acc_norm_stderr": 0.0279404571362284 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5546218487394958, "acc_stderr": 0.032284106267163895, "acc_norm": 0.5546218487394958, "acc_norm_stderr": 0.032284106267163895 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2847682119205298, "acc_stderr": 0.03684881521389023, "acc_norm": 0.2847682119205298, "acc_norm_stderr": 0.03684881521389023 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7302752293577982, "acc_stderr": 0.019028486711115438, "acc_norm": 0.7302752293577982, "acc_norm_stderr": 0.019028486711115438 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.38425925925925924, "acc_stderr": 0.03317354514310742, "acc_norm": 0.38425925925925924, "acc_norm_stderr": 0.03317354514310742 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.75, "acc_stderr": 
0.03039153369274154, "acc_norm": 0.75, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7468354430379747, "acc_stderr": 0.0283046579430353, "acc_norm": 0.7468354430379747, "acc_norm_stderr": 0.0283046579430353 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6681614349775785, "acc_stderr": 0.03160295143776678, "acc_norm": 0.6681614349775785, "acc_norm_stderr": 0.03160295143776678 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6259541984732825, "acc_stderr": 0.042438692422305246, "acc_norm": 0.6259541984732825, "acc_norm_stderr": 0.042438692422305246 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7222222222222222, "acc_stderr": 0.043300437496507416, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.043300437496507416 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.656441717791411, "acc_stderr": 0.037311335196738925, "acc_norm": 0.656441717791411, "acc_norm_stderr": 0.037311335196738925 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.38392857142857145, "acc_stderr": 0.04616143075028547, "acc_norm": 0.38392857142857145, "acc_norm_stderr": 0.04616143075028547 }, "harness|hendrycksTest-management|5": { "acc": 0.6796116504854369, "acc_stderr": 0.04620284082280041, "acc_norm": 0.6796116504854369, "acc_norm_stderr": 0.04620284082280041 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8205128205128205, "acc_stderr": 0.02514093595033544, "acc_norm": 0.8205128205128205, "acc_norm_stderr": 0.02514093595033544 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.52, "acc_stderr": 0.05021167315686779, "acc_norm": 0.52, "acc_norm_stderr": 0.05021167315686779 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7471264367816092, "acc_stderr": 0.015543377313719683, "acc_norm": 0.7471264367816092, "acc_norm_stderr": 0.015543377313719683 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6242774566473989, "acc_stderr": 0.026074314851657083, "acc_norm": 0.6242774566473989, "acc_norm_stderr": 0.026074314851657083 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.34301675977653634, "acc_stderr": 0.015876912673057738, "acc_norm": 0.34301675977653634, "acc_norm_stderr": 0.015876912673057738 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6209150326797386, "acc_stderr": 0.027780141207023344, "acc_norm": 0.6209150326797386, "acc_norm_stderr": 0.027780141207023344 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.617363344051447, "acc_stderr": 0.027604689028581986, "acc_norm": 0.617363344051447, "acc_norm_stderr": 0.027604689028581986 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6141975308641975, "acc_stderr": 0.027085401226132146, "acc_norm": 0.6141975308641975, "acc_norm_stderr": 0.027085401226132146 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.40425531914893614, "acc_stderr": 0.02927553215970473, "acc_norm": 0.40425531914893614, "acc_norm_stderr": 0.02927553215970473 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.42959582790091266, "acc_stderr": 0.012643004623790203, "acc_norm": 0.42959582790091266, "acc_norm_stderr": 0.012643004623790203 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5183823529411765, "acc_stderr": 0.030352303395351964, "acc_norm": 0.5183823529411765, "acc_norm_stderr": 0.030352303395351964 }, 
"harness|hendrycksTest-professional_psychology|5": { "acc": 0.5637254901960784, "acc_stderr": 0.02006287424353913, "acc_norm": 0.5637254901960784, "acc_norm_stderr": 0.02006287424353913 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6, "acc_stderr": 0.0469237132203465, "acc_norm": 0.6, "acc_norm_stderr": 0.0469237132203465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6326530612244898, "acc_stderr": 0.03086214492108756, "acc_norm": 0.6326530612244898, "acc_norm_stderr": 0.03086214492108756 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7263681592039801, "acc_stderr": 0.03152439186555401, "acc_norm": 0.7263681592039801, "acc_norm_stderr": 0.03152439186555401 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-virology|5": { "acc": 0.4819277108433735, "acc_stderr": 0.038899512528272166, "acc_norm": 0.4819277108433735, "acc_norm_stderr": 0.038899512528272166 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.783625730994152, "acc_stderr": 0.031581495393387324, "acc_norm": 0.783625730994152, "acc_norm_stderr": 0.031581495393387324 }, "harness|truthfulqa:mc|0": { "mc1": 0.3769889840881273, "mc1_stderr": 0.016965517578930354, "mc2": 0.5299552830341843, "mc2_stderr": 0.01569290592260198 }, "harness|winogrande|5": { "acc": 0.7434885556432518, "acc_stderr": 0.012273648008759987 }, "harness|gsm8k|5": { "acc": 0.014404852160727824, "acc_stderr": 0.003282055917136976 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
NLPCoreTeam/mmlu_ru
--- pretty_name: MMLU RU/EN language: - ru - en size_categories: - 10K<n<100K task_categories: - question-answering - multiple-choice task_ids: - multiple-choice-qa dataset_info: - config_name: abstract_algebra features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 2182 num_examples: 5 - name: val num_bytes: 5220 num_examples: 11 - name: test num_bytes: 50926 num_examples: 100 download_size: 5548198 dataset_size: 58328 - config_name: anatomy features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 2482 num_examples: 5 - name: val num_bytes: 8448 num_examples: 14 - name: test num_bytes: 91387 num_examples: 135 download_size: 5548198 dataset_size: 102317 - config_name: astronomy features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 6049 num_examples: 5 - name: val num_bytes: 14187 num_examples: 16 - name: test num_bytes: 130167 num_examples: 152 download_size: 5548198 dataset_size: 150403 - config_name: business_ethics features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 6197 num_examples: 5 - name: val num_bytes: 8963 num_examples: 11 - name: test num_bytes: 96566 num_examples: 100 download_size: 5548198 dataset_size: 111726 - config_name: clinical_knowledge features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 3236 num_examples: 5 - name: val num_bytes: 18684 num_examples: 29 - name: test num_bytes: 178043 num_examples: 265 download_size: 5548198 dataset_size: 199963 - config_name: college_biology features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 4232 num_examples: 5 - name: val num_bytes: 13521 num_examples: 16 - name: test num_bytes: 139322 num_examples: 144 download_size: 5548198 dataset_size: 157075 - config_name: college_chemistry features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 3533 num_examples: 5 - name: val num_bytes: 6157 num_examples: 8 - name: test num_bytes: 65540 num_examples: 100 download_size: 5548198 dataset_size: 75230 - config_name: college_computer_science features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 7513 
num_examples: 5 - name: val num_bytes: 13341 num_examples: 11 - name: test num_bytes: 120578 num_examples: 100 download_size: 5548198 dataset_size: 141432 - config_name: college_mathematics features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 3841 num_examples: 5 - name: val num_bytes: 6835 num_examples: 11 - name: test num_bytes: 65110 num_examples: 100 download_size: 5548198 dataset_size: 75786 - config_name: college_medicine features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 4659 num_examples: 5 - name: val num_bytes: 22116 num_examples: 22 - name: test num_bytes: 235856 num_examples: 173 download_size: 5548198 dataset_size: 262631 - config_name: college_physics features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 3740 num_examples: 5 - name: val num_bytes: 9491 num_examples: 11 - name: test num_bytes: 81480 num_examples: 102 download_size: 5548198 dataset_size: 94711 - config_name: computer_security features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 3150 num_examples: 5 - name: val num_bytes: 12859 num_examples: 11 - name: test num_bytes: 77969 num_examples: 100 download_size: 5548198 dataset_size: 93978 - config_name: conceptual_physics features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 2611 num_examples: 5 - name: val num_bytes: 12480 num_examples: 26 - name: test num_bytes: 112243 num_examples: 235 download_size: 5548198 dataset_size: 127334 - config_name: econometrics features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 4548 num_examples: 5 - name: val num_bytes: 13874 num_examples: 12 - name: test num_bytes: 128633 num_examples: 114 download_size: 5548198 dataset_size: 147055 - config_name: electrical_engineering features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 2598 num_examples: 5 - name: val num_bytes: 8003 num_examples: 16 - name: test num_bytes: 70846 num_examples: 145 download_size: 5548198 dataset_size: 81447 - config_name: elementary_mathematics features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 3760 
num_examples: 5 - name: val num_bytes: 23416 num_examples: 41 - name: test num_bytes: 181090 num_examples: 378 download_size: 5548198 dataset_size: 208266 - config_name: formal_logic features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 4715 num_examples: 5 - name: val num_bytes: 17099 num_examples: 14 - name: test num_bytes: 133930 num_examples: 126 download_size: 5548198 dataset_size: 155744 - config_name: global_facts features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 3450 num_examples: 5 - name: val num_bytes: 4971 num_examples: 10 - name: test num_bytes: 51481 num_examples: 100 download_size: 5548198 dataset_size: 59902 - config_name: high_school_biology features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 4759 num_examples: 5 - name: val num_bytes: 30807 num_examples: 32 - name: test num_bytes: 310356 num_examples: 310 download_size: 5548198 dataset_size: 345922 - config_name: high_school_chemistry features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 3204 num_examples: 5 - name: val num_bytes: 18948 num_examples: 22 - name: test num_bytes: 158246 num_examples: 203 download_size: 5548198 dataset_size: 180398 - config_name: high_school_computer_science features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 7933 num_examples: 5 - name: val num_bytes: 9612 num_examples: 9 - name: test num_bytes: 126403 num_examples: 100 download_size: 5548198 dataset_size: 143948 - config_name: high_school_european_history features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 32447 num_examples: 5 - name: val num_bytes: 83098 num_examples: 18 - name: test num_bytes: 754136 num_examples: 165 download_size: 5548198 dataset_size: 869681 - config_name: high_school_geography features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 4131 num_examples: 5 - name: val num_bytes: 12467 num_examples: 22 - name: test num_bytes: 119021 num_examples: 198 download_size: 5548198 dataset_size: 135619 - config_name: high_school_government_and_politics features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string 
splits: - name: dev num_bytes: 5188 num_examples: 5 - name: val num_bytes: 20564 num_examples: 21 - name: test num_bytes: 194050 num_examples: 193 download_size: 5548198 dataset_size: 219802 - config_name: high_school_macroeconomics features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 3942 num_examples: 5 - name: val num_bytes: 37243 num_examples: 43 - name: test num_bytes: 340699 num_examples: 390 download_size: 5548198 dataset_size: 381884 - config_name: high_school_mathematics features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 3244 num_examples: 5 - name: val num_bytes: 14758 num_examples: 29 - name: test num_bytes: 140257 num_examples: 270 download_size: 5548198 dataset_size: 158259 - config_name: high_school_microeconomics features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 3503 num_examples: 5 - name: val num_bytes: 22212 num_examples: 26 - name: test num_bytes: 219097 num_examples: 238 download_size: 5548198 dataset_size: 244812 - config_name: high_school_physics features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 3905 num_examples: 5 - name: val num_bytes: 18535 num_examples: 17 - name: test num_bytes: 162917 num_examples: 151 download_size: 5548198 dataset_size: 185357 - config_name: high_school_psychology features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 5207 num_examples: 5 - name: val num_bytes: 49277 num_examples: 60 - name: test num_bytes: 455603 num_examples: 545 download_size: 5548198 dataset_size: 510087 - config_name: high_school_statistics features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 6823 num_examples: 5 - name: val num_bytes: 28020 num_examples: 23 - name: test num_bytes: 312578 num_examples: 216 download_size: 5548198 dataset_size: 347421 - config_name: high_school_us_history features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 25578 num_examples: 5 - name: val num_bytes: 91278 num_examples: 22 - name: test num_bytes: 842680 num_examples: 204 download_size: 5548198 dataset_size: 959536 - config_name: high_school_world_history features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru 
dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 13893 num_examples: 5 - name: val num_bytes: 129121 num_examples: 26 - name: test num_bytes: 1068018 num_examples: 237 download_size: 5548198 dataset_size: 1211032 - config_name: human_aging features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 2820 num_examples: 5 - name: val num_bytes: 13442 num_examples: 23 - name: test num_bytes: 132242 num_examples: 223 download_size: 5548198 dataset_size: 148504 - config_name: human_sexuality features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 3072 num_examples: 5 - name: val num_bytes: 6699 num_examples: 12 - name: test num_bytes: 90007 num_examples: 131 download_size: 5548198 dataset_size: 99778 - config_name: international_law features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 6880 num_examples: 5 - name: val num_bytes: 19166 num_examples: 13 - name: test num_bytes: 157259 num_examples: 121 download_size: 5548198 dataset_size: 183305 - config_name: jurisprudence features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 3568 num_examples: 5 - name: val num_bytes: 10638 num_examples: 11 - name: test num_bytes: 97121 num_examples: 108 download_size: 5548198 dataset_size: 111327 - config_name: logical_fallacies features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 4526 num_examples: 5 - name: val num_bytes: 14547 num_examples: 18 - name: test num_bytes: 144501 num_examples: 163 download_size: 5548198 dataset_size: 163574 - config_name: machine_learning features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 6966 num_examples: 5 - name: val num_bytes: 8986 num_examples: 11 - name: test num_bytes: 95571 num_examples: 112 download_size: 5548198 dataset_size: 111523 - config_name: management features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 2427 num_examples: 5 - name: val num_bytes: 5210 num_examples: 11 - name: test num_bytes: 57201 num_examples: 103 download_size: 5548198 dataset_size: 64838 - config_name: marketing features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru 
sequence: string splits: - name: dev num_bytes: 4514 num_examples: 5 - name: val num_bytes: 20832 num_examples: 25 - name: test num_bytes: 181786 num_examples: 234 download_size: 5548198 dataset_size: 207132 - config_name: medical_genetics features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 3226 num_examples: 5 - name: val num_bytes: 8214 num_examples: 11 - name: test num_bytes: 57064 num_examples: 100 download_size: 5548198 dataset_size: 68504 - config_name: miscellaneous features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 1782 num_examples: 5 - name: val num_bytes: 39225 num_examples: 86 - name: test num_bytes: 407209 num_examples: 783 download_size: 5548198 dataset_size: 448216 - config_name: moral_disputes features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 4910 num_examples: 5 - name: val num_bytes: 36026 num_examples: 38 - name: test num_bytes: 313611 num_examples: 346 download_size: 5548198 dataset_size: 354547 - config_name: moral_scenarios features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 6175 num_examples: 5 - name: val num_bytes: 129062 num_examples: 100 - name: test num_bytes: 1137631 num_examples: 895 download_size: 5548198 dataset_size: 1272868 - config_name: nutrition features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 6030 num_examples: 5 - name: val num_bytes: 24210 num_examples: 33 - name: test num_bytes: 266173 num_examples: 306 download_size: 5548198 dataset_size: 296413 - config_name: philosophy features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 2631 num_examples: 5 - name: val num_bytes: 25751 num_examples: 34 - name: test num_bytes: 227086 num_examples: 311 download_size: 5548198 dataset_size: 255468 - config_name: prehistory features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 5394 num_examples: 5 - name: val num_bytes: 28687 num_examples: 35 - name: test num_bytes: 251723 num_examples: 324 download_size: 5548198 dataset_size: 285804 - config_name: professional_accounting features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - 
name: dev num_bytes: 6277 num_examples: 5 - name: val num_bytes: 40914 num_examples: 31 - name: test num_bytes: 364528 num_examples: 282 download_size: 5548198 dataset_size: 411719 - config_name: professional_law features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 19120 num_examples: 5 - name: val num_bytes: 589307 num_examples: 170 - name: test num_bytes: 5479411 num_examples: 1534 download_size: 5548198 dataset_size: 6087838 - config_name: professional_medicine features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 10901 num_examples: 5 - name: val num_bytes: 69703 num_examples: 31 - name: test num_bytes: 633483 num_examples: 272 download_size: 5548198 dataset_size: 714087 - config_name: professional_psychology features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 6430 num_examples: 5 - name: val num_bytes: 82745 num_examples: 69 - name: test num_bytes: 648634 num_examples: 612 download_size: 5548198 dataset_size: 737809 - config_name: public_relations features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 4384 num_examples: 5 - name: val num_bytes: 13108 num_examples: 12 - name: test num_bytes: 82403 num_examples: 110 download_size: 5548198 dataset_size: 99895 - config_name: security_studies features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 16064 num_examples: 5 - name: val num_bytes: 67877 num_examples: 27 - name: test num_bytes: 611059 num_examples: 245 download_size: 5548198 dataset_size: 695000 - config_name: sociology features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 4693 num_examples: 5 - name: val num_bytes: 20654 num_examples: 22 - name: test num_bytes: 191420 num_examples: 201 download_size: 5548198 dataset_size: 216767 - config_name: us_foreign_policy features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 4781 num_examples: 5 - name: val num_bytes: 9171 num_examples: 11 - name: test num_bytes: 81649 num_examples: 100 download_size: 5548198 dataset_size: 95601 - config_name: virology features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev 
num_bytes: 3063 num_examples: 5 - name: val num_bytes: 15618 num_examples: 18 - name: test num_bytes: 111027 num_examples: 166 download_size: 5548198 dataset_size: 129708 - config_name: world_religions features: - name: question_en dtype: string - name: choices_en sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question_ru dtype: string - name: choices_ru sequence: string splits: - name: dev num_bytes: 1691 num_examples: 5 - name: val num_bytes: 7052 num_examples: 19 - name: test num_bytes: 65559 num_examples: 171 download_size: 5548198 dataset_size: 74302 --- # MMLU in Russian (Massive Multitask Language Understanding) ## Overview of the Dataset MMLU dataset for EN/RU, without the auxiliary train split. The dataset contains `dev`/`val`/`test` splits for both English and Russian. Note that it does not include the `auxiliary_train` split, which was not translated. In total, the dataset has ~16k samples per language: 285 `dev`, 1531 `val`, 14042 `test`. ## Description of original MMLU The MMLU dataset covers 57 different tasks. Each task requires choosing the right answer out of four options for a given question. Paper: "Measuring Massive Multitask Language Understanding", https://arxiv.org/abs/2009.03300v3. It is also known as the "hendrycks_test". ## Dataset Creation The translation was made via the Yandex.Translate API. There are some translation mistakes, especially with terms and formulas; no fixes were applied. The initial dataset was taken from: https://people.eecs.berkeley.edu/~hendrycks/data.tar. ## Sample example ``` { "question_en": "Why doesn't Venus have seasons like Mars and Earth do?", "choices_en": [ "Its rotation axis is nearly perpendicular to the plane of the Solar System.", "It does not have an ozone layer.", "It does not rotate fast enough.", "It is too close to the Sun." ], "answer": 0, "question_ru": "Почему на Венере нет времен года, как на Марсе и Земле?", "choices_ru": [ "Ось его вращения почти перпендикулярна плоскости Солнечной системы.", "У него нет озонового слоя.", "Он вращается недостаточно быстро.", "Это слишком близко к Солнцу."
] } ``` ## Usage To merge all subsets into one dataframe per split: ```python from collections import defaultdict import datasets import pandas as pd subjects = ["abstract_algebra", "anatomy", "astronomy", "business_ethics", "clinical_knowledge", "college_biology", "college_chemistry", "college_computer_science", "college_mathematics", "college_medicine", "college_physics", "computer_security", "conceptual_physics", "econometrics", "electrical_engineering", "elementary_mathematics", "formal_logic", "global_facts", "high_school_biology", "high_school_chemistry", "high_school_computer_science", "high_school_european_history", "high_school_geography", "high_school_government_and_politics", "high_school_macroeconomics", "high_school_mathematics", "high_school_microeconomics", "high_school_physics", "high_school_psychology", "high_school_statistics", "high_school_us_history", "high_school_world_history", "human_aging", "human_sexuality", "international_law", "jurisprudence", "logical_fallacies", "machine_learning", "management", "marketing", "medical_genetics", "miscellaneous", "moral_disputes", "moral_scenarios", "nutrition", "philosophy", "prehistory", "professional_accounting", "professional_law", "professional_medicine", "professional_psychology", "public_relations", "security_studies", "sociology", "us_foreign_policy", "virology", "world_religions"] splits = ["dev", "val", "test"] all_datasets = {x: datasets.load_dataset("NLPCoreTeam/mmlu_ru", name=x) for x in subjects} res = defaultdict(list) for subject in subjects: for split in splits: dataset = all_datasets[subject][split] df = dataset.to_pandas() int2str = dataset.features['answer'].int2str df['answer'] = df['answer'].map(int2str) df.insert(loc=0, column='subject_en', value=subject) res[split].append(df) res = {k: pd.concat(v) for k, v in res.items()} df_dev = res['dev'] df_val = res['val'] df_test = res['test'] ``` ## Evaluation This dataset is intended for evaluating LLMs in a few-shot/zero-shot setup; a minimal prompt-construction sketch is included at the end of this card. Evaluation code: https://github.com/NLP-Core-Team/mmlu_ru These resources might also be helpful: 1. https://github.com/hendrycks/test 2. https://github.com/openai/evals/blob/main/examples/mmlu.ipynb 3. https://github.com/EleutherAI/lm-evaluation-harness/blob/master/lm_eval/tasks/hendrycks_test.py ## Contributions Dataset added by the NLP core team RnD [Telegram channel](https://t.me/nlpcoreteam)
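As a complement to the Evaluation section of the MMLU RU/EN card above, the sketch below shows one plausible way to turn the `dev` split into a k-shot prompt for a single subject. It is a hedged illustration, not the official evaluation code linked above: the prompt template, the `letters` list, and the `format_example` helper are assumptions introduced here for clarity.
```python
# Minimal, unofficial sketch of the few-shot setup described in the Evaluation
# section above. The prompt template, the `letters` list, and the
# `format_example` helper are illustrative assumptions, not part of the dataset
# or of the linked evaluation repository.
import datasets

subject = "abstract_algebra"  # any of the 57 config names listed in Usage
k = 5                         # number of few-shot examples taken from `dev`
letters = ["A", "B", "C", "D"]

ds = datasets.load_dataset("NLPCoreTeam/mmlu_ru", name=subject)

def format_example(row, lang="ru", with_answer=True):
    # `question_{lang}` / `choices_{lang}` hold the question text and the four
    # options; `answer` is a ClassLabel index (0..3) mapped to a letter here.
    lines = [row[f"question_{lang}"]] + [
        f"{letter}. {choice}" for letter, choice in zip(letters, row[f"choices_{lang}"])
    ]
    tail = f"\nAnswer: {letters[row['answer']]}" if with_answer else "\nAnswer:"
    return "\n".join(lines) + tail

# k-shot context from `dev`, followed by one unanswered question from `test`.
shots = [format_example(ds["dev"][i]) for i in range(k)]
query = format_example(ds["test"][0], with_answer=False)
prompt = "\n\n".join(shots + [query])
print(prompt)
```
Feeding the resulting `prompt` to a model and comparing its preference over the A/B/C/D continuations is the usual MMLU-style scoring; the repository linked in the Evaluation section should be treated as the authoritative implementation.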
Patryk5675/test5
--- license: gfdl ---
Tippawan/test2-data-semi-p1
--- dataset_info: features: - name: tokens sequence: string - name: ner_tags sequence: int64 - name: prob sequence: float64 - name: ifpass sequence: int64 - name: pred dtype: int64 - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 10279348 num_examples: 6601 download_size: 1455542 dataset_size: 10279348 configs: - config_name: default data_files: - split: train path: data/train-* ---
llm-blender/PairRM-2.7B-data
--- configs: - config_name: full_train data_files: - split: train path: "pair_sft_data.train.Nectar-full.jsonl" - config_name: ht_train data_files: - split: train path: "pair_sft_data.train.Nectar-head-tail.jsonl" # - split: train_ht # - config_name: herm data_files: - split: train path: "pair_sft_data.test.herm.jsonl" # - config_name: herm_test # data_files: # - split: test # path: "pair_sft_data.test.herm.jsonl" dataset_info: - config_name: train features: - name: id dtype: string - name: instruction dtype: string - name: input dtype: string - name: output dtype: string - name: meta_data struct: - name: _instruction dtype: string - name: _response_A dtype: string - name: _response_B dtype: string - name: _label dtype: string - name: _subset dtype: string - config_name: herm_test features: - name: id dtype: int32 - name: instruction dtype: string - name: input dtype: string - name: output dtype: string - name: meta_data struct: - name: _instruction dtype: string - name: _response_A dtype: string - name: _response_B dtype: string - name: _label dtype: string - name: _subset dtype: string - name: _category dtype: string ---
Ajax101/ChineseWebText
--- license: mit ---
jubba/ev-skins-blip-lg
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 13554378.0 num_examples: 215 download_size: 13363408 dataset_size: 13554378.0 --- # Dataset Card for "ev-skins-blip-lg" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
shi3z/ja_testqa
--- license: mit ---
usvsnsp/semantic-duplicates
--- dataset_info: features: - name: index dtype: int64 - name: 0.9_frequencies dtype: int64 - name: 0.8_frequencies dtype: int64 splits: - name: duped_2.8b_snowclones num_bytes: 40201848 num_examples: 1675077 - name: duped_6.9b_templates num_bytes: 50903256 num_examples: 2120969 - name: deduped_6.9b_templates num_bytes: 40327056 num_examples: 1680294 - name: deduped_1.4b_templates num_bytes: 25154328 num_examples: 1048097 - name: deduped_snowclones num_bytes: 120000000 num_examples: 5000000 - name: duped_1b_templates num_bytes: 30147384 num_examples: 1256141 - name: duped_12b_templates num_bytes: 57175824 num_examples: 2382326 - name: deduped_160m_snowclones num_bytes: 13948680 num_examples: 581195 - name: deduped_1b_snowclones num_bytes: 24788760 num_examples: 1032865 - name: duped_70m_snowclones num_bytes: 11134872 num_examples: 463953 - name: deduped_1.4b_snowclones num_bytes: 25154328 num_examples: 1048097 - name: duped_1.4b_templates num_bytes: 32969328 num_examples: 1373722 - name: deduped_1b_templates num_bytes: 24788760 num_examples: 1032865 - name: deduped_2.8b_templates num_bytes: 32525064 num_examples: 1355211 - name: duped_2.8b_templates num_bytes: 40201848 num_examples: 1675077 - name: duped_6.9b_snowclones num_bytes: 50903256 num_examples: 2120969 - name: duped_410m_snowclones num_bytes: 23288184 num_examples: 970341 - name: deduped_410m_templates num_bytes: 19464936 num_examples: 811039 - name: duped_410m_templates num_bytes: 23288184 num_examples: 970341 - name: deduped_160m_templates num_bytes: 13948680 num_examples: 581195 - name: deduped_70m_templates num_bytes: 9874752 num_examples: 411448 - name: duped_160m_templates num_bytes: 16552152 num_examples: 689673 - name: duped_12b_snowclones num_bytes: 57175824 num_examples: 2382326 - name: duped_snowclones num_bytes: 120000000 num_examples: 5000000 - name: deduped_2.8b_snowclones num_bytes: 32525064 num_examples: 1355211 - name: deduped_410m_snowclones num_bytes: 19464936 num_examples: 811039 - name: duped_160m_snowclones num_bytes: 16552152 num_examples: 689673 - name: deduped_6.9b_snowclones num_bytes: 40327056 num_examples: 1680294 - name: deduped_70m_snowclones num_bytes: 9874752 num_examples: 411448 - name: duped_1b_snowclones num_bytes: 30147384 num_examples: 1256141 - name: duped_1.4b_snowclones num_bytes: 32969328 num_examples: 1373722 - name: duped_70m_templates num_bytes: 11134872 num_examples: 463953 - name: duped_templates num_bytes: 120000000 num_examples: 5000000 - name: deduped_templates num_bytes: 120000000 num_examples: 5000000 - name: deduped_12b_templates num_bytes: 44909160 num_examples: 1871215 - name: deduped_12b_snowclones num_bytes: 44909160 num_examples: 1871215 download_size: 531300635 dataset_size: 1516549488 configs: - config_name: default data_files: - split: duped_2.8b_snowclones path: data/duped_2.8b_snowclones-* - split: duped_6.9b_templates path: data/duped_6.9b_templates-* - split: deduped_6.9b_templates path: data/deduped_6.9b_templates-* - split: deduped_1.4b_templates path: data/deduped_1.4b_templates-* - split: deduped_snowclones path: data/deduped_snowclones-* - split: duped_1b_templates path: data/duped_1b_templates-* - split: duped_12b_templates path: data/duped_12b_templates-* - split: deduped_160m_snowclones path: data/deduped_160m_snowclones-* - split: deduped_1b_snowclones path: data/deduped_1b_snowclones-* - split: duped_70m_snowclones path: data/duped_70m_snowclones-* - split: deduped_1.4b_snowclones path: data/deduped_1.4b_snowclones-* - split: duped_1.4b_templates path: 
data/duped_1.4b_templates-* - split: deduped_1b_templates path: data/deduped_1b_templates-* - split: deduped_2.8b_templates path: data/deduped_2.8b_templates-* - split: duped_2.8b_templates path: data/duped_2.8b_templates-* - split: duped_6.9b_snowclones path: data/duped_6.9b_snowclones-* - split: duped_410m_snowclones path: data/duped_410m_snowclones-* - split: deduped_410m_templates path: data/deduped_410m_templates-* - split: duped_410m_templates path: data/duped_410m_templates-* - split: deduped_160m_templates path: data/deduped_160m_templates-* - split: deduped_70m_templates path: data/deduped_70m_templates-* - split: duped_160m_templates path: data/duped_160m_templates-* - split: duped_12b_snowclones path: data/duped_12b_snowclones-* - split: duped_snowclones path: data/duped_snowclones-* - split: deduped_2.8b_snowclones path: data/deduped_2.8b_snowclones-* - split: deduped_410m_snowclones path: data/deduped_410m_snowclones-* - split: duped_160m_snowclones path: data/duped_160m_snowclones-* - split: deduped_6.9b_snowclones path: data/deduped_6.9b_snowclones-* - split: deduped_70m_snowclones path: data/deduped_70m_snowclones-* - split: duped_1b_snowclones path: data/duped_1b_snowclones-* - split: duped_1.4b_snowclones path: data/duped_1.4b_snowclones-* - split: duped_70m_templates path: data/duped_70m_templates-* - split: duped_templates path: data/duped_templates-* - split: deduped_templates path: data/deduped_templates-* - split: deduped_12b_templates path: data/deduped_12b_templates-* - split: deduped_12b_snowclones path: data/deduped_12b_snowclones-* ---
open-llm-leaderboard/details_azale-ai__DukunLM-7B-V1.0-Uncensored
--- pretty_name: Evaluation run of azale-ai/DukunLM-7B-V1.0-Uncensored dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [azale-ai/DukunLM-7B-V1.0-Uncensored](https://huggingface.co/azale-ai/DukunLM-7B-V1.0-Uncensored)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_azale-ai__DukunLM-7B-V1.0-Uncensored\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-02-04T03:32:00.345040](https://huggingface.co/datasets/open-llm-leaderboard/details_azale-ai__DukunLM-7B-V1.0-Uncensored/blob/main/results_2024-02-04T03-32-00.345040.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4018543967795034,\n\ \ \"acc_stderr\": 0.0342455004465676,\n \"acc_norm\": 0.4061924250003242,\n\ \ \"acc_norm_stderr\": 0.03506469624535442,\n \"mc1\": 0.2998776009791922,\n\ \ \"mc1_stderr\": 0.016040352966713634,\n \"mc2\": 0.43947585501681957,\n\ \ \"mc2_stderr\": 0.015779310526247342\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.4854948805460751,\n \"acc_stderr\": 0.014605241081370056,\n\ \ \"acc_norm\": 0.5110921501706485,\n \"acc_norm_stderr\": 0.014607794914013053\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.573590918143796,\n\ \ \"acc_stderr\": 0.004935439955031695,\n \"acc_norm\": 0.7562238597888866,\n\ \ \"acc_norm_stderr\": 0.0042848172384067134\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\ \ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.4148148148148148,\n\ \ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.3815789473684211,\n \"acc_stderr\": 0.03953173377749194,\n\ \ \"acc_norm\": 0.3815789473684211,\n \"acc_norm_stderr\": 0.03953173377749194\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\ \ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.4679245283018868,\n \"acc_stderr\": 0.03070948699255654,\n\ \ \"acc_norm\": 0.4679245283018868,\n \"acc_norm_stderr\": 0.03070948699255654\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4027777777777778,\n\ \ \"acc_stderr\": 0.04101405519842425,\n \"acc_norm\": 0.4027777777777778,\n\ \ \"acc_norm_stderr\": 
0.04101405519842425\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\ \ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.27167630057803466,\n\ \ \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.27167630057803466,\n\ \ \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\ \ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n\ \ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.03190701242326812,\n\ \ \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.03190701242326812\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\ \ \"acc_stderr\": 0.041857744240220554,\n \"acc_norm\": 0.2719298245614035,\n\ \ \"acc_norm_stderr\": 0.041857744240220554\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.3310344827586207,\n \"acc_stderr\": 0.03921545312467122,\n\ \ \"acc_norm\": 0.3310344827586207,\n \"acc_norm_stderr\": 0.03921545312467122\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"\ acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\ \ \"acc_stderr\": 0.041349130183033156,\n \"acc_norm\": 0.30952380952380953,\n\ \ \"acc_norm_stderr\": 0.041349130183033156\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3870967741935484,\n\ \ \"acc_stderr\": 0.027709359675032488,\n \"acc_norm\": 0.3870967741935484,\n\ \ \"acc_norm_stderr\": 0.027709359675032488\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868408,\n\ \ \"acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868408\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\"\ : 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.03898531605579418,\n\ \ \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.03898531605579418\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.4797979797979798,\n \"acc_stderr\": 0.03559443565563918,\n \"\ acc_norm\": 0.4797979797979798,\n \"acc_norm_stderr\": 0.03559443565563918\n\ \ },\n 
\"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.5284974093264249,\n \"acc_stderr\": 0.03602573571288441,\n\ \ \"acc_norm\": 0.5284974093264249,\n \"acc_norm_stderr\": 0.03602573571288441\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.34615384615384615,\n \"acc_stderr\": 0.024121125416941176,\n\ \ \"acc_norm\": 0.34615384615384615,\n \"acc_norm_stderr\": 0.024121125416941176\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.23333333333333334,\n \"acc_stderr\": 0.025787874220959302,\n \ \ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.025787874220959302\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.37815126050420167,\n \"acc_stderr\": 0.03149930577784906,\n\ \ \"acc_norm\": 0.37815126050420167,\n \"acc_norm_stderr\": 0.03149930577784906\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.271523178807947,\n \"acc_stderr\": 0.036313298039696545,\n \"\ acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696545\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.45504587155963305,\n \"acc_stderr\": 0.021350503090925167,\n \"\ acc_norm\": 0.45504587155963305,\n \"acc_norm_stderr\": 0.021350503090925167\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.3194444444444444,\n \"acc_stderr\": 0.0317987634217685,\n \"acc_norm\"\ : 0.3194444444444444,\n \"acc_norm_stderr\": 0.0317987634217685\n },\n\ \ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4950980392156863,\n\ \ \"acc_stderr\": 0.035091433756067866,\n \"acc_norm\": 0.4950980392156863,\n\ \ \"acc_norm_stderr\": 0.035091433756067866\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.5063291139240507,\n \"acc_stderr\": 0.032544620107678585,\n\ \ \"acc_norm\": 0.5063291139240507,\n \"acc_norm_stderr\": 0.032544620107678585\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4977578475336323,\n\ \ \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.4977578475336323,\n\ \ \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.4580152671755725,\n \"acc_stderr\": 0.04369802690578757,\n\ \ \"acc_norm\": 0.4580152671755725,\n \"acc_norm_stderr\": 0.04369802690578757\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.5206611570247934,\n \"acc_stderr\": 0.04560456086387235,\n \"\ acc_norm\": 0.5206611570247934,\n \"acc_norm_stderr\": 0.04560456086387235\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n\ \ \"acc_stderr\": 0.04812917324536821,\n \"acc_norm\": 0.4537037037037037,\n\ \ \"acc_norm_stderr\": 0.04812917324536821\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.4539877300613497,\n \"acc_stderr\": 0.0391170190467718,\n\ \ \"acc_norm\": 0.4539877300613497,\n \"acc_norm_stderr\": 0.0391170190467718\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\ \ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\ \ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.4854368932038835,\n \"acc_stderr\": 0.04948637324026637,\n\ \ \"acc_norm\": 0.4854368932038835,\n \"acc_norm_stderr\": 0.04948637324026637\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5683760683760684,\n\ \ 
\"acc_stderr\": 0.0324483553531149,\n \"acc_norm\": 0.5683760683760684,\n\ \ \"acc_norm_stderr\": 0.0324483553531149\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\ : {\n \"acc\": 0.545338441890166,\n \"acc_stderr\": 0.0178063045850526,\n\ \ \"acc_norm\": 0.545338441890166,\n \"acc_norm_stderr\": 0.0178063045850526\n\ \ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.44508670520231214,\n\ \ \"acc_stderr\": 0.02675625512966377,\n \"acc_norm\": 0.44508670520231214,\n\ \ \"acc_norm_stderr\": 0.02675625512966377\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\ : {\n \"acc\": 0.2346368715083799,\n \"acc_stderr\": 0.014173044098303679,\n\ \ \"acc_norm\": 0.2346368715083799,\n \"acc_norm_stderr\": 0.014173044098303679\n\ \ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.39869281045751637,\n\ \ \"acc_stderr\": 0.02803609227389177,\n \"acc_norm\": 0.39869281045751637,\n\ \ \"acc_norm_stderr\": 0.02803609227389177\n },\n \"harness|hendrycksTest-philosophy|5\"\ : {\n \"acc\": 0.43729903536977494,\n \"acc_stderr\": 0.02817391776176289,\n\ \ \"acc_norm\": 0.43729903536977494,\n \"acc_norm_stderr\": 0.02817391776176289\n\ \ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.44753086419753085,\n\ \ \"acc_stderr\": 0.027667138569422704,\n \"acc_norm\": 0.44753086419753085,\n\ \ \"acc_norm_stderr\": 0.027667138569422704\n },\n \"harness|hendrycksTest-professional_accounting|5\"\ : {\n \"acc\": 0.3120567375886525,\n \"acc_stderr\": 0.027640120545169927,\n\ \ \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.027640120545169927\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3305084745762712,\n\ \ \"acc_stderr\": 0.012014142101842963,\n \"acc_norm\": 0.3305084745762712,\n\ \ \"acc_norm_stderr\": 0.012014142101842963\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.34191176470588236,\n \"acc_stderr\": 0.028814722422254184,\n\ \ \"acc_norm\": 0.34191176470588236,\n \"acc_norm_stderr\": 0.028814722422254184\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.3954248366013072,\n \"acc_stderr\": 0.019780465954777518,\n \ \ \"acc_norm\": 0.3954248366013072,\n \"acc_norm_stderr\": 0.019780465954777518\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n\ \ \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n\ \ \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.3836734693877551,\n \"acc_stderr\": 0.031130880396235933,\n\ \ \"acc_norm\": 0.3836734693877551,\n \"acc_norm_stderr\": 0.031130880396235933\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5124378109452736,\n\ \ \"acc_stderr\": 0.0353443984853958,\n \"acc_norm\": 0.5124378109452736,\n\ \ \"acc_norm_stderr\": 0.0353443984853958\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \ \ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n\ \ \"acc_stderr\": 0.0368078369072758,\n \"acc_norm\": 0.3373493975903614,\n\ \ \"acc_norm_stderr\": 0.0368078369072758\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.038110796698335316,\n\ \ \"acc_norm\": 
0.5555555555555556,\n \"acc_norm_stderr\": 0.038110796698335316\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2998776009791922,\n\ \ \"mc1_stderr\": 0.016040352966713634,\n \"mc2\": 0.43947585501681957,\n\ \ \"mc2_stderr\": 0.015779310526247342\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.6953433307024467,\n \"acc_stderr\": 0.012935646499325307\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.060652009097801364,\n \ \ \"acc_stderr\": 0.006574733381405782\n }\n}\n```" repo_url: https://huggingface.co/azale-ai/DukunLM-7B-V1.0-Uncensored leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|arc:challenge|25_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-02-04T03-32-00.345040.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|gsm8k|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hellaswag|10_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T03-32-00.345040.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T03-32-00.345040.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-04T03-32-00.345040.parquet' - 
'**/details_harness|hendrycksTest-marketing|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-04T03-32-00.345040.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_02_04T03_32_00.345040 path: 
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - 
'**/details_harness|hendrycksTest-global_facts|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-management|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-management|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - 
split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-virology|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T03-32-00.345040.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|truthfulqa:mc|0_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-02-04T03-32-00.345040.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_02_04T03_32_00.345040 path: - '**/details_harness|winogrande|5_2024-02-04T03-32-00.345040.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-02-04T03-32-00.345040.parquet' - config_name: results data_files: - split: 2024_02_04T03_32_00.345040 path: - results_2024-02-04T03-32-00.345040.parquet - split: latest path: - results_2024-02-04T03-32-00.345040.parquet --- # Dataset Card for Evaluation run of azale-ai/DukunLM-7B-V1.0-Uncensored <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [azale-ai/DukunLM-7B-V1.0-Uncensored](https://huggingface.co/azale-ai/DukunLM-7B-V1.0-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. 
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_azale-ai__DukunLM-7B-V1.0-Uncensored", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-04T03:32:00.345040](https://huggingface.co/datasets/open-llm-leaderboard/details_azale-ai__DukunLM-7B-V1.0-Uncensored/blob/main/results_2024-02-04T03-32-00.345040.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4018543967795034, "acc_stderr": 0.0342455004465676, "acc_norm": 0.4061924250003242, "acc_norm_stderr": 0.03506469624535442, "mc1": 0.2998776009791922, "mc1_stderr": 0.016040352966713634, "mc2": 0.43947585501681957, "mc2_stderr": 0.015779310526247342 }, "harness|arc:challenge|25": { "acc": 0.4854948805460751, "acc_stderr": 0.014605241081370056, "acc_norm": 0.5110921501706485, "acc_norm_stderr": 0.014607794914013053 }, "harness|hellaswag|10": { "acc": 0.573590918143796, "acc_stderr": 0.004935439955031695, "acc_norm": 0.7562238597888866, "acc_norm_stderr": 0.0042848172384067134 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4148148148148148, "acc_stderr": 0.04256193767901408, "acc_norm": 0.4148148148148148, "acc_norm_stderr": 0.04256193767901408 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.3815789473684211, "acc_stderr": 0.03953173377749194, "acc_norm": 0.3815789473684211, "acc_norm_stderr": 0.03953173377749194 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.4679245283018868, "acc_stderr": 0.03070948699255654, "acc_norm": 0.4679245283018868, "acc_norm_stderr": 0.03070948699255654 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4027777777777778, "acc_stderr": 0.04101405519842425, "acc_norm": 0.4027777777777778, "acc_norm_stderr": 0.04101405519842425 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.27167630057803466, "acc_stderr": 0.03391750322321659, "acc_norm": 0.27167630057803466, "acc_norm_stderr": 0.03391750322321659 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.22549019607843138, "acc_stderr": 0.041583075330832865, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.041583075330832865 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.49, "acc_stderr": 0.05024183937956911, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956911 }, 
"harness|hendrycksTest-conceptual_physics|5": { "acc": 0.39148936170212767, "acc_stderr": 0.03190701242326812, "acc_norm": 0.39148936170212767, "acc_norm_stderr": 0.03190701242326812 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2719298245614035, "acc_stderr": 0.041857744240220554, "acc_norm": 0.2719298245614035, "acc_norm_stderr": 0.041857744240220554 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.3310344827586207, "acc_stderr": 0.03921545312467122, "acc_norm": 0.3310344827586207, "acc_norm_stderr": 0.03921545312467122 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25132275132275134, "acc_stderr": 0.022340482339643895, "acc_norm": 0.25132275132275134, "acc_norm_stderr": 0.022340482339643895 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.30952380952380953, "acc_stderr": 0.041349130183033156, "acc_norm": 0.30952380952380953, "acc_norm_stderr": 0.041349130183033156 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.24, "acc_stderr": 0.04292346959909283, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3870967741935484, "acc_stderr": 0.027709359675032488, "acc_norm": 0.3870967741935484, "acc_norm_stderr": 0.027709359675032488 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3054187192118227, "acc_stderr": 0.03240661565868408, "acc_norm": 0.3054187192118227, "acc_norm_stderr": 0.03240661565868408 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.5272727272727272, "acc_stderr": 0.03898531605579418, "acc_norm": 0.5272727272727272, "acc_norm_stderr": 0.03898531605579418 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.4797979797979798, "acc_stderr": 0.03559443565563918, "acc_norm": 0.4797979797979798, "acc_norm_stderr": 0.03559443565563918 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.5284974093264249, "acc_stderr": 0.03602573571288441, "acc_norm": 0.5284974093264249, "acc_norm_stderr": 0.03602573571288441 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.34615384615384615, "acc_stderr": 0.024121125416941176, "acc_norm": 0.34615384615384615, "acc_norm_stderr": 0.024121125416941176 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.23333333333333334, "acc_stderr": 0.025787874220959302, "acc_norm": 0.23333333333333334, "acc_norm_stderr": 0.025787874220959302 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.37815126050420167, "acc_stderr": 0.03149930577784906, "acc_norm": 0.37815126050420167, "acc_norm_stderr": 0.03149930577784906 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.271523178807947, "acc_stderr": 0.036313298039696545, "acc_norm": 0.271523178807947, "acc_norm_stderr": 0.036313298039696545 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.45504587155963305, "acc_stderr": 0.021350503090925167, "acc_norm": 0.45504587155963305, "acc_norm_stderr": 0.021350503090925167 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3194444444444444, "acc_stderr": 0.0317987634217685, "acc_norm": 0.3194444444444444, "acc_norm_stderr": 0.0317987634217685 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.4950980392156863, "acc_stderr": 0.035091433756067866, "acc_norm": 0.4950980392156863, "acc_norm_stderr": 0.035091433756067866 }, 
"harness|hendrycksTest-high_school_world_history|5": { "acc": 0.5063291139240507, "acc_stderr": 0.032544620107678585, "acc_norm": 0.5063291139240507, "acc_norm_stderr": 0.032544620107678585 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.4977578475336323, "acc_stderr": 0.033557465352232634, "acc_norm": 0.4977578475336323, "acc_norm_stderr": 0.033557465352232634 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.4580152671755725, "acc_stderr": 0.04369802690578757, "acc_norm": 0.4580152671755725, "acc_norm_stderr": 0.04369802690578757 }, "harness|hendrycksTest-international_law|5": { "acc": 0.5206611570247934, "acc_stderr": 0.04560456086387235, "acc_norm": 0.5206611570247934, "acc_norm_stderr": 0.04560456086387235 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.4537037037037037, "acc_stderr": 0.04812917324536821, "acc_norm": 0.4537037037037037, "acc_norm_stderr": 0.04812917324536821 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.4539877300613497, "acc_stderr": 0.0391170190467718, "acc_norm": 0.4539877300613497, "acc_norm_stderr": 0.0391170190467718 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.35714285714285715, "acc_stderr": 0.04547960999764376, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.04547960999764376 }, "harness|hendrycksTest-management|5": { "acc": 0.4854368932038835, "acc_stderr": 0.04948637324026637, "acc_norm": 0.4854368932038835, "acc_norm_stderr": 0.04948637324026637 }, "harness|hendrycksTest-marketing|5": { "acc": 0.5683760683760684, "acc_stderr": 0.0324483553531149, "acc_norm": 0.5683760683760684, "acc_norm_stderr": 0.0324483553531149 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.545338441890166, "acc_stderr": 0.0178063045850526, "acc_norm": 0.545338441890166, "acc_norm_stderr": 0.0178063045850526 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.44508670520231214, "acc_stderr": 0.02675625512966377, "acc_norm": 0.44508670520231214, "acc_norm_stderr": 0.02675625512966377 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2346368715083799, "acc_stderr": 0.014173044098303679, "acc_norm": 0.2346368715083799, "acc_norm_stderr": 0.014173044098303679 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.39869281045751637, "acc_stderr": 0.02803609227389177, "acc_norm": 0.39869281045751637, "acc_norm_stderr": 0.02803609227389177 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.43729903536977494, "acc_stderr": 0.02817391776176289, "acc_norm": 0.43729903536977494, "acc_norm_stderr": 0.02817391776176289 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.44753086419753085, "acc_stderr": 0.027667138569422704, "acc_norm": 0.44753086419753085, "acc_norm_stderr": 0.027667138569422704 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3120567375886525, "acc_stderr": 0.027640120545169927, "acc_norm": 0.3120567375886525, "acc_norm_stderr": 0.027640120545169927 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3305084745762712, "acc_stderr": 0.012014142101842963, "acc_norm": 0.3305084745762712, "acc_norm_stderr": 0.012014142101842963 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.34191176470588236, "acc_stderr": 0.028814722422254184, "acc_norm": 0.34191176470588236, "acc_norm_stderr": 0.028814722422254184 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.3954248366013072, "acc_stderr": 0.019780465954777518, "acc_norm": 0.3954248366013072, 
"acc_norm_stderr": 0.019780465954777518 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.4727272727272727, "acc_stderr": 0.04782001791380063, "acc_norm": 0.4727272727272727, "acc_norm_stderr": 0.04782001791380063 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.3836734693877551, "acc_stderr": 0.031130880396235933, "acc_norm": 0.3836734693877551, "acc_norm_stderr": 0.031130880396235933 }, "harness|hendrycksTest-sociology|5": { "acc": 0.5124378109452736, "acc_stderr": 0.0353443984853958, "acc_norm": 0.5124378109452736, "acc_norm_stderr": 0.0353443984853958 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-virology|5": { "acc": 0.3373493975903614, "acc_stderr": 0.0368078369072758, "acc_norm": 0.3373493975903614, "acc_norm_stderr": 0.0368078369072758 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.5555555555555556, "acc_stderr": 0.038110796698335316, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.038110796698335316 }, "harness|truthfulqa:mc|0": { "mc1": 0.2998776009791922, "mc1_stderr": 0.016040352966713634, "mc2": 0.43947585501681957, "mc2_stderr": 0.015779310526247342 }, "harness|winogrande|5": { "acc": 0.6953433307024467, "acc_stderr": 0.012935646499325307 }, "harness|gsm8k|5": { "acc": 0.060652009097801364, "acc_stderr": 0.006574733381405782 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. 
--> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
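The card above also exposes an aggregated "results" configuration and a "latest" split for each eval. A minimal usage sketch, assuming only the repo id, configuration name, and split name shown above (the columns inside the parquet files are not described here), might look like:

```python
from datasets import load_dataset

# Minimal sketch: the repo id, config name ("results") and split name ("latest")
# are taken from the card above; column names are not guaranteed.
results = load_dataset(
    "open-llm-leaderboard/details_azale-ai__DukunLM-7B-V1.0-Uncensored",
    "results",
    split="latest",
)

# Each row stores the aggregated metrics of one evaluation run.
print(results[0])
```

Per-task details can be loaded the same way by swapping "results" for one of the harness_* configurations listed in the metadata above.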
rookshanks/small-the_pile
--- dataset_info: features: - name: text dtype: string - name: meta struct: - name: perplexity_score dtype: float64 - name: pile_set_name dtype: string splits: - name: train num_bytes: 484845334.4 num_examples: 80000 - name: validation num_bytes: 60605666.8 num_examples: 10000 - name: test num_bytes: 60605666.8 num_examples: 10000 download_size: 329390472 dataset_size: 606056667.9999999 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* ---
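No README body accompanies the metadata above, but the `dataset_info` block already documents the schema (a `text` column plus a `meta` struct with `perplexity_score` and `pile_set_name`) and three splits. A minimal loading sketch, assuming only the repo id from the entry header and the fields listed above:

```python
from datasets import load_dataset

# Minimal sketch: repo id comes from the entry header above; field names follow
# the dataset_info block (text, meta.perplexity_score, meta.pile_set_name).
ds = load_dataset("rookshanks/small-the_pile", split="validation")

example = ds[0]
print(example["meta"]["pile_set_name"], example["meta"]["perplexity_score"])
print(example["text"][:200])
```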
edarchimbaud/earnings-forecast-stocks
--- language: - en license: mit task_categories: - tabular-regression dataset_info: features: - name: symbol dtype: string - name: date dtype: string - name: id dtype: int64 - name: fiscal_end dtype: string - name: consensus_eps_forecast dtype: float64 - name: high_eps_forecast dtype: float64 - name: low_eps_forecast dtype: float64 - name: no_of_estimates dtype: int64 - name: up dtype: int64 - name: down dtype: int64 splits: - name: train num_bytes: 8431444 num_examples: 94547 download_size: 768366 dataset_size: 8431444 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "earnings-forecast-sp500" ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** https://edarchimbaud.substack.com - **Repository:** https://github.com/edarchimbaud - **Point of Contact:** contact@edarchimbaud.com ### Dataset Summary The earnings-forecast-sp500 dataset provides information about the earnings forecast for the S&P 500 index constituents. The dataset includes features that detail each company's fiscal end, the consensus earnings per share (EPS) forecast, the high and low EPS forecasts, the number of estimates, and the number of upward and downward revisions. ### Supported Tasks and Leaderboards [N/A] ### Languages [N/A] ## Dataset Structure ### Data Instances [N/A] ### Data Fields - symbol (string): A string representing the ticker symbol or abbreviation used to identify the company. - date (string): A string indicating the date of the forecast. - id (int64): An integer representing the unique identifier for the forecast. - fiscal_end (string): A string indicating the fiscal end date for the forecast. - consensus_eps_forecast (float64): A floating-point number representing the consensus earnings per share forecast. - high_eps_forecast (float64): A floating-point number representing the highest earnings per share forecast. - low_eps_forecast (float64): A floating-point number representing the lowest earnings per share forecast. - no_of_estimates (int64): An integer representing the number of estimates contributing to the consensus forecast. - up (int64): An integer representing the number of upward revisions to the forecast. - down (int64): An integer representing the number of downward revisions to the forecast. 
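A minimal sketch of reading these fields, assuming the repo id `edarchimbaud/earnings-forecast-stocks` from the entry header and the `train` split declared in the metadata above:

```python
from datasets import load_dataset

# Minimal sketch: repo id from the entry header; column names follow the
# Data Fields list above.
ds = load_dataset("edarchimbaud/earnings-forecast-stocks", split="train")

row = ds[0]
print(row["symbol"], row["fiscal_end"], row["consensus_eps_forecast"])

# Spread between the highest and lowest analyst EPS estimates, if both are present.
if row["high_eps_forecast"] is not None and row["low_eps_forecast"] is not None:
    print("EPS forecast spread:", row["high_eps_forecast"] - row["low_eps_forecast"])
```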
### Data Splits [N/A] ## Dataset Creation ### Curation Rationale The earnings-forecast-sp500 dataset was created to support the development of high-frequency trading algorithms and investment strategies that rely on earnings forecasts. ### Source Data #### Initial Data Collection and Normalization This data was sourced from financial data providers and normalized for consistency. ### Annotations #### Annotation process [N/A] #### Who are the annotators? [N/A] ### Personal and Sensitive Information [N/A] ## Considerations for Using the Data ### Social Impact of Dataset [N/A] ### Discussion of Biases [N/A] ### Other Known Limitations [N/A] ## Additional Information ### Dataset Curators The earnings-forecast-sp500 dataset was collected by https://edarchimbaud.substack.com. ### Licensing Information The earnings-forecast-sp500 dataset is licensed under the MIT License. ### Citation Information > https://edarchimbaud.substack.com, earnings-forecast-sp500 dataset, GitHub repository, https://github.com/edarchimbaud ### Contributions Thanks to [@edarchimbaud](https://github.com/edarchimbaud) for adding this dataset.
open-llm-leaderboard/details_ChavyvAkvar__habib-DPO-v2
--- pretty_name: Evaluation run of ChavyvAkvar/habib-DPO-v2 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [ChavyvAkvar/habib-DPO-v2](https://huggingface.co/ChavyvAkvar/habib-DPO-v2) on\ \ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ChavyvAkvar__habib-DPO-v2\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-04-09T07:03:45.930589](https://huggingface.co/datasets/open-llm-leaderboard/details_ChavyvAkvar__habib-DPO-v2/blob/main/results_2024-04-09T07-03-45.930589.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6462785276692069,\n\ \ \"acc_stderr\": 0.03220347050364123,\n \"acc_norm\": 0.6469741474313779,\n\ \ \"acc_norm_stderr\": 0.03285576920664436,\n \"mc1\": 0.4834761321909425,\n\ \ \"mc1_stderr\": 0.017493940190057723,\n \"mc2\": 0.6519332799461027,\n\ \ \"mc2_stderr\": 0.015463853571885877\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.658703071672355,\n \"acc_stderr\": 0.013855831287497723,\n\ \ \"acc_norm\": 0.6877133105802048,\n \"acc_norm_stderr\": 0.013542598541688065\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6900019916351324,\n\ \ \"acc_stderr\": 0.0046154722103160396,\n \"acc_norm\": 0.8668591913961362,\n\ \ \"acc_norm_stderr\": 0.003390325458020255\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\ \ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\ \ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.038035102483515854,\n\ \ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.038035102483515854\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\ \ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \ \ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724064,\n\ \ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724064\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\ \ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\ \ \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \ \ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\ \ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\ \ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\ \ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\ \ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\ \ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\ \ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\ \ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\ \ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\ \ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\ acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\ \ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\ \ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\ \ \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n\ \ \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\ \ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\ \ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"\ acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": 
{\n\ \ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\ \ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\ \ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \ \ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\ \ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\ acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8532110091743119,\n \"acc_stderr\": 0.01517314184512625,\n \"\ acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.01517314184512625\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\ acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926913,\n \"\ acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926913\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \ \ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\ \ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n\ \ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\ \ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\ acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\ \ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\ \ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.03322015795776741,\n\ \ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.03322015795776741\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\ \ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \ \ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\ \ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\ \ \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n\ \ 
\"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \ \ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\ \ \"acc_stderr\": 0.013778693778464076,\n \"acc_norm\": 0.8186462324393359,\n\ \ \"acc_norm_stderr\": 0.013778693778464076\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\ \ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n\ \ \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n\ \ \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n\ \ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n\ \ \"acc_stderr\": 0.02521804037341063,\n \"acc_norm\": 0.729903536977492,\n\ \ \"acc_norm_stderr\": 0.02521804037341063\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.025171041915309684,\n\ \ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.025171041915309684\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \ \ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n\ \ \"acc_stderr\": 0.012734923579532067,\n \"acc_norm\": 0.46284224250325945,\n\ \ \"acc_norm_stderr\": 0.012734923579532067\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n\ \ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6486928104575164,\n \"acc_stderr\": 0.01931267606578655,\n \ \ \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.01931267606578655\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\ \ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\ \ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\ \ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\ \ \"acc_stderr\": 0.025870646766169146,\n \"acc_norm\": 0.8407960199004975,\n\ \ \"acc_norm_stderr\": 0.025870646766169146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \ \ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\ \ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\ \ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\ \ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 
0.02878210810540171\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4834761321909425,\n\ \ \"mc1_stderr\": 0.017493940190057723,\n \"mc2\": 0.6519332799461027,\n\ \ \"mc2_stderr\": 0.015463853571885877\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7940015785319653,\n \"acc_stderr\": 0.011366474352008828\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6633813495072024,\n \ \ \"acc_stderr\": 0.01301646367998336\n }\n}\n```" repo_url: https://huggingface.co/ChavyvAkvar/habib-DPO-v2 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|arc:challenge|25_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-04-09T07-03-45.930589.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|gsm8k|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hellaswag|10_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-03-45.930589.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-03-45.930589.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-03-45.930589.parquet' - 
'**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-03-45.930589.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - 
'**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-management|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-management|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - 
split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-03-45.930589.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|truthfulqa:mc|0_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-04-09T07-03-45.930589.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_04_09T07_03_45.930589 path: - '**/details_harness|winogrande|5_2024-04-09T07-03-45.930589.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-04-09T07-03-45.930589.parquet' - config_name: results data_files: - split: 2024_04_09T07_03_45.930589 path: - results_2024-04-09T07-03-45.930589.parquet - split: latest path: - results_2024-04-09T07-03-45.930589.parquet --- # Dataset Card for Evaluation run of ChavyvAkvar/habib-DPO-v2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [ChavyvAkvar/habib-DPO-v2](https://huggingface.co/ChavyvAkvar/habib-DPO-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. 
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ChavyvAkvar__habib-DPO-v2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-04-09T07:03:45.930589](https://huggingface.co/datasets/open-llm-leaderboard/details_ChavyvAkvar__habib-DPO-v2/blob/main/results_2024-04-09T07-03-45.930589.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6462785276692069, "acc_stderr": 0.03220347050364123, "acc_norm": 0.6469741474313779, "acc_norm_stderr": 0.03285576920664436, "mc1": 0.4834761321909425, "mc1_stderr": 0.017493940190057723, "mc2": 0.6519332799461027, "mc2_stderr": 0.015463853571885877 }, "harness|arc:challenge|25": { "acc": 0.658703071672355, "acc_stderr": 0.013855831287497723, "acc_norm": 0.6877133105802048, "acc_norm_stderr": 0.013542598541688065 }, "harness|hellaswag|10": { "acc": 0.6900019916351324, "acc_stderr": 0.0046154722103160396, "acc_norm": 0.8668591913961362, "acc_norm_stderr": 0.003390325458020255 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6518518518518519, "acc_stderr": 0.041153246103369526, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6776315789473685, "acc_stderr": 0.038035102483515854, "acc_norm": 0.6776315789473685, "acc_norm_stderr": 0.038035102483515854 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7245283018867924, "acc_stderr": 0.027495663683724064, "acc_norm": 0.7245283018867924, "acc_norm_stderr": 0.027495663683724064 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7430555555555556, "acc_stderr": 0.03653946969442099, "acc_norm": 0.7430555555555556, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3235294117647059, "acc_stderr": 0.046550104113196177, "acc_norm": 0.3235294117647059, "acc_norm_stderr": 0.046550104113196177 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, 
"harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5957446808510638, "acc_stderr": 0.03208115750788684, "acc_norm": 0.5957446808510638, "acc_norm_stderr": 0.03208115750788684 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5175438596491229, "acc_stderr": 0.04700708033551038, "acc_norm": 0.5175438596491229, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370332, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370332 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42592592592592593, "acc_stderr": 0.02546714904546955, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.02546714904546955 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7709677419354839, "acc_stderr": 0.023904914311782648, "acc_norm": 0.7709677419354839, "acc_norm_stderr": 0.023904914311782648 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.03517603540361008, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.03517603540361008 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.029620227874790486, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.029620227874790486 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8860103626943006, "acc_stderr": 0.022935144053919443, "acc_norm": 0.8860103626943006, "acc_norm_stderr": 0.022935144053919443 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.028897748741131154, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.028897748741131154 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886793, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886793 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8532110091743119, "acc_stderr": 0.01517314184512625, "acc_norm": 0.8532110091743119, "acc_norm_stderr": 0.01517314184512625 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926913, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926913 }, 
"harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.025530100460233494, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.025530100460233494 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.030769352008229143, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.030769352008229143 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.03641297081313729, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.03641297081313729 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.03322015795776741, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.03322015795776741 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8547008547008547, "acc_stderr": 0.023086635086841403, "acc_norm": 0.8547008547008547, "acc_norm_stderr": 0.023086635086841403 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8186462324393359, "acc_stderr": 0.013778693778464076, "acc_norm": 0.8186462324393359, "acc_norm_stderr": 0.013778693778464076 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069356, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069356 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4480446927374302, "acc_stderr": 0.016631976628930595, "acc_norm": 0.4480446927374302, "acc_norm_stderr": 0.016631976628930595 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7058823529411765, "acc_stderr": 0.026090162504279056, "acc_norm": 0.7058823529411765, "acc_norm_stderr": 0.026090162504279056 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.729903536977492, "acc_stderr": 0.02521804037341063, "acc_norm": 0.729903536977492, "acc_norm_stderr": 0.02521804037341063 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7129629629629629, "acc_stderr": 0.025171041915309684, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.025171041915309684 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.46808510638297873, "acc_stderr": 0.029766675075873866, "acc_norm": 0.46808510638297873, "acc_norm_stderr": 0.029766675075873866 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46284224250325945, "acc_stderr": 0.012734923579532067, "acc_norm": 0.46284224250325945, "acc_norm_stderr": 0.012734923579532067 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6948529411764706, "acc_stderr": 0.027971541370170598, "acc_norm": 0.6948529411764706, "acc_norm_stderr": 0.027971541370170598 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6486928104575164, "acc_stderr": 0.01931267606578655, "acc_norm": 0.6486928104575164, 
"acc_norm_stderr": 0.01931267606578655 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169146, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169146 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.4834761321909425, "mc1_stderr": 0.017493940190057723, "mc2": 0.6519332799461027, "mc2_stderr": 0.015463853571885877 }, "harness|winogrande|5": { "acc": 0.7940015785319653, "acc_stderr": 0.011366474352008828 }, "harness|gsm8k|5": { "acc": 0.6633813495072024, "acc_stderr": 0.01301646367998336 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. 
--> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
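As a complement to the loading snippet earlier in this card, here is a minimal sketch for reading the aggregated `results` configuration listed in the configs above; it assumes only the standard `datasets` API, and the printed fields are illustrative since the exact record layout of that parquet file is not documented here.

```python
from datasets import load_dataset

# Load the aggregated results configuration described above; the "latest" split
# always points at the most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_ChavyvAkvar__habib-DPO-v2",
    "results",
    split="latest",
)

# Illustrative only: inspect the first (and typically only) aggregated record.
print(results[0])
```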
artificialguybr/muskdataplay
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 82900120.0 num_examples: 35 download_size: 80704590 dataset_size: 82900120.0 configs: - config_name: default data_files: - split: train path: data/train-* ---
pavlichenko/oasst1_one_turn
--- dataset_info: features: - name: messages list: - name: content dtype: string - name: role dtype: string splits: - name: train num_bytes: 38624256 num_examples: 39663 - name: test num_bytes: 4360982 num_examples: 4407 download_size: 25696749 dataset_size: 42985238 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
bilal01/stamp-verification
--- dataset_info: features: - name: pixel_values dtype: image - name: label dtype: image splits: - name: train num_bytes: 1191542422.0 num_examples: 60 download_size: 332235726 dataset_size: 1191542422.0 --- # Dataset Card for "stamp-verification" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
notrichardren/truthfulness_all
--- configs: - config_name: default data_files: - split: combined path: data/combined-* - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: claim dtype: string - name: label dtype: int64 - name: dataset dtype: string - name: qa_type dtype: int64 - name: ind dtype: int64 splits: - name: combined num_bytes: 27403282 num_examples: 278491 - name: train num_bytes: 21924321 num_examples: 222792 - name: test num_bytes: 5478961 num_examples: 55699 download_size: 14478745 dataset_size: 54806564 --- # Dataset Card for "truthfulness_all" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
sam-mosaic/iv4-chatml
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: source dtype: string - name: prompt dtype: string - name: response dtype: string splits: - name: train num_bytes: 2349114457.0 num_examples: 387277 - name: test num_bytes: 351904407.0 num_examples: 57556 download_size: 1361629459 dataset_size: 2701018864.0 --- # Dataset Card for "iv4-chatml" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
charchits7/test-images
--- license: artistic-2.0 ---
liuyanchen1015/MULTI_VALUE_rte_for_to_pupose
--- dataset_info: features: - name: sentence1 dtype: string - name: sentence2 dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: test num_bytes: 52268 num_examples: 112 - name: train num_bytes: 43727 num_examples: 91 download_size: 73945 dataset_size: 95995 --- # Dataset Card for "MULTI_VALUE_rte_for_to_pupose" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
B0808/MDbA_FineTuning
--- dataset_info: features: - name: audio dtype: audio - name: text dtype: string - name: label dtype: string splits: - name: test num_bytes: 2125885.0 num_examples: 5 - name: validation num_bytes: 2125915.0 num_examples: 5 - name: train num_bytes: 2125890.0 num_examples: 5 download_size: 8736282 dataset_size: 6377690.0 configs: - config_name: default data_files: - split: test path: data/test-* - split: validation path: data/validation-* - split: train path: data/train-* ---
result-kand2-sdxl-wuerst-karlo/b5ddd948
--- dataset_info: features: - name: result dtype: string - name: id dtype: int64 splits: - name: train num_bytes: 205 num_examples: 10 download_size: 1388 dataset_size: 205 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "b5ddd948" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
autoevaluate/autoeval-staging-eval-project-xsum-d7ddcd7b-12845710
--- type: predictions tags: - autotrain - evaluation datasets: - xsum eval_info: task: summarization model: sysresearch101/t5-large-finetuned-xsum-cnn metrics: [] dataset_name: xsum dataset_config: default dataset_split: test col_mapping: text: document target: summary --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Summarization * Model: sysresearch101/t5-large-finetuned-xsum-cnn * Dataset: xsum * Config: default * Split: test To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@sysresearch101](https://huggingface.co/sysresearch101) for evaluating this model.
LocalDoc/news_azerbaijan_2
--- language: - az license: cc-by-nc-4.0 size_categories: - 100K<n<1M task_categories: - text-generation - fill-mask pretty_name: Azerbaijani News Dataset from https://musavat.com/ tags: - news dataset_info: features: - name: id dtype: int64 - name: date dtype: string - name: category dtype: string - name: title dtype: string - name: text dtype: string splits: - name: train num_bytes: 1678632196 num_examples: 753359 download_size: 936135505 dataset_size: 1678632196 configs: - config_name: default data_files: - split: train path: data/train-* --- <h2>Azerbaijani News Dataset</h2> Description This dataset contains news from https://musavat.com/ in the Azerbaijani language. It was created in 2024 and contains 753k news articles (approximately 11 million sentences). Format The dataset is provided in comma-separated values (CSV) format. Each article is represented on a new line with the following fields separated by commas: id: unique news id date: news date category: news category title: news title text: news text License Copyright of the content belongs to the https://musavat.com/ resource. Citation is mandatory when using the information; when you use information from this site, a link to the relevant source is required.<br> The dataset is licensed under the Creative Commons Attribution-NonCommercial 4.0 International license. This license allows you to freely share and redistribute the dataset with attribution to the source but prohibits commercial use. Contact information If you have any questions or suggestions, please contact us at [v.resad.89@gmail.com].
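A minimal loading sketch, assuming only the standard `datasets` library together with the `train` split and the fields (`id`, `date`, `category`, `title`, `text`) declared above; note that although the card describes a CSV layout, the Hub configuration above points at parquet shards, which `load_dataset` resolves transparently.

```python
from datasets import load_dataset

# Load the single "train" split declared in the dataset configuration above.
news = load_dataset("LocalDoc/news_azerbaijan_2", split="train")

# Inspect one article using the fields listed in the card.
article = news[0]
print(article["date"], article["category"])
print(article["title"])
print(article["text"][:500])  # first 500 characters of the body
```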
Nexdata/Filipino_Speaking_English_Speech_Data_by_Mobile_Phone
--- YAML tags: - copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging --- # Dataset Card for Nexdata/Filipino_Speaking_English_Speech_Data_by_Mobile_Phone ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** https://www.nexdata.ai/datasets/1124?source=Huggingface - **Repository:** - **Paper:** - **Leaderboard:** - **Point of Contact:** ### Dataset Summary 1,000 hours of Filipino English audio data captured by mobile phones, recorded by native Filipino speakers. The recorded text was designed by linguistic experts, covering generic, interactive, on-board, home and other categories. The text has been proofread manually with high accuracy; this dataset can be used for automatic speech recognition, machine translation, and voiceprint recognition. For more details, please refer to the link: https://www.nexdata.ai/datasets/1124?source=Huggingface ### Supported Tasks and Leaderboards automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR). ### Languages Filipino English ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing ### Citation Information [More Information Needed] ### Contributions
bilalahmadai/open_assistant_dataset_QA
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: question dtype: string - name: answer dtype: string splits: - name: train num_bytes: 782135 num_examples: 2000 download_size: 483861 dataset_size: 782135 --- # Dataset Card for "open_assistant_dataset_QA" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
SerchiBoi/test
--- license: mit ---
Saauan/leetcode-performance
--- license: cc0-1.0 task_categories: - text-generation pretty_name: Leetcode performance dataset size_categories: - n<1K --- # Dataset card for Leetcode Performance Dataset
Delius/ChineseWebNovel
--- license: apache-2.0 task_categories: - text-generation language: - zh size_categories: - 1K<n<10K --- Chinese Web Novel Dataset. Summarized by Claude, but with the ordering converted for a novel text-extension task. Warning: please be aware of the context length!
Gbssreejith/Birth_cm_type2_dataset
--- dataset_info: features: - name: image dtype: image - name: ground_truth dtype: string splits: - name: train num_bytes: 112737415.0 num_examples: 245 - name: val num_bytes: 10327570.0 num_examples: 28 download_size: 122206200 dataset_size: 123064985.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: val path: data/val-* ---
ChaiML/100_example_conversations
--- dataset_info: features: - name: conversation dtype: string - name: bot_label dtype: string - name: user_label dtype: string - name: description dtype: string - name: first_message dtype: string - name: prompt dtype: string - name: memory dtype: string - name: introduction dtype: string - name: name dtype: string splits: - name: train num_bytes: 394959 num_examples: 100 download_size: 217141 dataset_size: 394959 --- # Dataset Card for "100_example_conversations" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Falah/military_machinery_prompts
--- dataset_info: features: - name: prompts dtype: string splits: - name: train num_bytes: 37777094 num_examples: 100000 download_size: 4294425 dataset_size: 37777094 --- # Dataset Card for "military_machinery_prompts" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_G-reen__EXPERIMENT-SFT-m7b2-1-merged
--- pretty_name: Evaluation run of G-reen/EXPERIMENT-SFT-m7b2-1-merged dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [G-reen/EXPERIMENT-SFT-m7b2-1-merged](https://huggingface.co/G-reen/EXPERIMENT-SFT-m7b2-1-merged)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_G-reen__EXPERIMENT-SFT-m7b2-1-merged\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-04-15T14:12:35.118660](https://huggingface.co/datasets/open-llm-leaderboard/details_G-reen__EXPERIMENT-SFT-m7b2-1-merged/blob/main/results_2024-04-15T14-12-35.118660.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5658688059204553,\n\ \ \"acc_stderr\": 0.033833837163937154,\n \"acc_norm\": 0.5715157799730491,\n\ \ \"acc_norm_stderr\": 0.03455992983674417,\n \"mc1\": 0.3047735618115055,\n\ \ \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.46287509015755035,\n\ \ \"mc2_stderr\": 0.014995166023377332\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5418088737201365,\n \"acc_stderr\": 0.014560220308714698,\n\ \ \"acc_norm\": 0.568259385665529,\n \"acc_norm_stderr\": 0.014474591427196206\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.602867954590719,\n\ \ \"acc_stderr\": 0.004883037758919966,\n \"acc_norm\": 0.797450707030472,\n\ \ \"acc_norm_stderr\": 0.004010779679661521\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\ \ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\ \ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\ \ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\ \ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \ \ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365252,\n\ \ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365252\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\ \ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\ \ \"acc_norm_stderr\": 
0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\ \ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n\ \ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \ \ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\ \ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\ \ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\ \ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.03266204299064678,\n\ \ \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.03266204299064678\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\ \ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\ \ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\ \ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.37037037037037035,\n \"acc_stderr\": 0.024870815251057093,\n \"\ acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.024870815251057093\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\ \ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\ \ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\ \ \"acc_stderr\": 0.026860206444724342,\n \"acc_norm\": 0.6645161290322581,\n\ \ \"acc_norm_stderr\": 0.026860206444724342\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\ \ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\ : 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\ \ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\ acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\ \ },\n 
\"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\ \ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5435897435897435,\n \"acc_stderr\": 0.025254485424799605,\n\ \ \"acc_norm\": 0.5435897435897435,\n \"acc_norm_stderr\": 0.025254485424799605\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02671924078371217,\n \ \ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02671924078371217\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182088,\n \ \ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182088\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\ acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7357798165137615,\n \"acc_stderr\": 0.018904164171510182,\n \"\ acc_norm\": 0.7357798165137615,\n \"acc_norm_stderr\": 0.018904164171510182\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653063,\n \"\ acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653063\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7156862745098039,\n \"acc_stderr\": 0.03166009679399812,\n \"\ acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.03166009679399812\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293437,\n \ \ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293437\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\ \ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n\ \ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\ \ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\ acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\ \ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\ \ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\ \ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\ \ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\ \ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326469,\n\ \ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326469\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\ \ 
\"acc_stderr\": 0.025140935950335442,\n \"acc_norm\": 0.8205128205128205,\n\ \ \"acc_norm_stderr\": 0.025140935950335442\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \ \ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7535121328224776,\n\ \ \"acc_stderr\": 0.015411308769686933,\n \"acc_norm\": 0.7535121328224776,\n\ \ \"acc_norm_stderr\": 0.015411308769686933\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584197,\n\ \ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584197\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3553072625698324,\n\ \ \"acc_stderr\": 0.01600698993480319,\n \"acc_norm\": 0.3553072625698324,\n\ \ \"acc_norm_stderr\": 0.01600698993480319\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.027634176689602663,\n\ \ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.027634176689602663\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\ \ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\ \ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6172839506172839,\n \"acc_stderr\": 0.027044538138402588,\n\ \ \"acc_norm\": 0.6172839506172839,\n \"acc_norm_stderr\": 0.027044538138402588\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284066,\n \ \ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284066\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4132985658409387,\n\ \ \"acc_stderr\": 0.012576779494860087,\n \"acc_norm\": 0.4132985658409387,\n\ \ \"acc_norm_stderr\": 0.012576779494860087\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.030161911930767105,\n\ \ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.030161911930767105\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5555555555555556,\n \"acc_stderr\": 0.020102583895887188,\n \ \ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.020102583895887188\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\ \ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\ \ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547724,\n\ \ \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547724\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\ \ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\ \ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\ \ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\ \ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7426900584795322,\n 
\"acc_stderr\": 0.03352799844161865,\n\ \ \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n\ \ \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.46287509015755035,\n\ \ \"mc2_stderr\": 0.014995166023377332\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183524\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2532221379833207,\n \ \ \"acc_stderr\": 0.011978125194299687\n }\n}\n```" repo_url: https://huggingface.co/G-reen/EXPERIMENT-SFT-m7b2-1-merged leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|arc:challenge|25_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-04-15T14-12-35.118660.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|gsm8k|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hellaswag|10_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-12-35.118660.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-12-35.118660.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-15T14-12-35.118660.parquet' - 
'**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-12-35.118660.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_04_15T14_12_35.118660 path: 
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - 
'**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-management|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-management|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - 
split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-12-35.118660.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|truthfulqa:mc|0_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-04-15T14-12-35.118660.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_04_15T14_12_35.118660 path: - '**/details_harness|winogrande|5_2024-04-15T14-12-35.118660.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-04-15T14-12-35.118660.parquet' - config_name: results data_files: - split: 2024_04_15T14_12_35.118660 path: - results_2024-04-15T14-12-35.118660.parquet - split: latest path: - results_2024-04-15T14-12-35.118660.parquet --- # Dataset Card for Evaluation run of G-reen/EXPERIMENT-SFT-m7b2-1-merged <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [G-reen/EXPERIMENT-SFT-m7b2-1-merged](https://huggingface.co/G-reen/EXPERIMENT-SFT-m7b2-1-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. 
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_G-reen__EXPERIMENT-SFT-m7b2-1-merged", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-04-15T14:12:35.118660](https://huggingface.co/datasets/open-llm-leaderboard/details_G-reen__EXPERIMENT-SFT-m7b2-1-merged/blob/main/results_2024-04-15T14-12-35.118660.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5658688059204553, "acc_stderr": 0.033833837163937154, "acc_norm": 0.5715157799730491, "acc_norm_stderr": 0.03455992983674417, "mc1": 0.3047735618115055, "mc1_stderr": 0.016114124156882455, "mc2": 0.46287509015755035, "mc2_stderr": 0.014995166023377332 }, "harness|arc:challenge|25": { "acc": 0.5418088737201365, "acc_stderr": 0.014560220308714698, "acc_norm": 0.568259385665529, "acc_norm_stderr": 0.014474591427196206 }, "harness|hellaswag|10": { "acc": 0.602867954590719, "acc_stderr": 0.004883037758919966, "acc_norm": 0.797450707030472, "acc_norm_stderr": 0.004010779679661521 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4740740740740741, "acc_stderr": 0.04313531696750574, "acc_norm": 0.4740740740740741, "acc_norm_stderr": 0.04313531696750574 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5855263157894737, "acc_stderr": 0.04008973785779206, "acc_norm": 0.5855263157894737, "acc_norm_stderr": 0.04008973785779206 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6339622641509434, "acc_stderr": 0.029647813539365252, "acc_norm": 0.6339622641509434, "acc_norm_stderr": 0.029647813539365252 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6111111111111112, "acc_stderr": 0.04076663253918567, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.04076663253918567 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5317919075144508, "acc_stderr": 0.03804749744364764, "acc_norm": 0.5317919075144508, "acc_norm_stderr": 0.03804749744364764 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04690650298201942, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04690650298201942 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, 
"harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4808510638297872, "acc_stderr": 0.03266204299064678, "acc_norm": 0.4808510638297872, "acc_norm_stderr": 0.03266204299064678 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.37719298245614036, "acc_stderr": 0.04559522141958216, "acc_norm": 0.37719298245614036, "acc_norm_stderr": 0.04559522141958216 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.45517241379310347, "acc_stderr": 0.04149886942192117, "acc_norm": 0.45517241379310347, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.37037037037037035, "acc_stderr": 0.024870815251057093, "acc_norm": 0.37037037037037035, "acc_norm_stderr": 0.024870815251057093 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42063492063492064, "acc_stderr": 0.04415438226743744, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.04415438226743744 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6645161290322581, "acc_stderr": 0.026860206444724342, "acc_norm": 0.6645161290322581, "acc_norm_stderr": 0.026860206444724342 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.45320197044334976, "acc_stderr": 0.03502544650845872, "acc_norm": 0.45320197044334976, "acc_norm_stderr": 0.03502544650845872 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6545454545454545, "acc_stderr": 0.03713158067481913, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.03713158067481913 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6868686868686869, "acc_stderr": 0.033042050878136525, "acc_norm": 0.6868686868686869, "acc_norm_stderr": 0.033042050878136525 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8186528497409327, "acc_stderr": 0.02780703236068609, "acc_norm": 0.8186528497409327, "acc_norm_stderr": 0.02780703236068609 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5435897435897435, "acc_stderr": 0.025254485424799605, "acc_norm": 0.5435897435897435, "acc_norm_stderr": 0.025254485424799605 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25925925925925924, "acc_stderr": 0.02671924078371217, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.02671924078371217 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5462184873949579, "acc_stderr": 0.03233943468182088, "acc_norm": 0.5462184873949579, "acc_norm_stderr": 0.03233943468182088 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7357798165137615, "acc_stderr": 0.018904164171510182, "acc_norm": 0.7357798165137615, "acc_norm_stderr": 0.018904164171510182 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.42592592592592593, "acc_stderr": 0.03372343271653063, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.03372343271653063 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7156862745098039, "acc_stderr": 0.03166009679399812, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.03166009679399812 }, 
"harness|hendrycksTest-high_school_world_history|5": { "acc": 0.729957805907173, "acc_stderr": 0.028900721906293437, "acc_norm": 0.729957805907173, "acc_norm_stderr": 0.028900721906293437 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6412556053811659, "acc_stderr": 0.032190792004199956, "acc_norm": 0.6412556053811659, "acc_norm_stderr": 0.032190792004199956 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6641221374045801, "acc_stderr": 0.041423137719966634, "acc_norm": 0.6641221374045801, "acc_norm_stderr": 0.041423137719966634 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04065578140908705, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04065578140908705 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7129629629629629, "acc_stderr": 0.043733130409147614, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.043733130409147614 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6809815950920245, "acc_stderr": 0.03661997551073836, "acc_norm": 0.6809815950920245, "acc_norm_stderr": 0.03661997551073836 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7184466019417476, "acc_stderr": 0.04453254836326469, "acc_norm": 0.7184466019417476, "acc_norm_stderr": 0.04453254836326469 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8205128205128205, "acc_stderr": 0.025140935950335442, "acc_norm": 0.8205128205128205, "acc_norm_stderr": 0.025140935950335442 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7535121328224776, "acc_stderr": 0.015411308769686933, "acc_norm": 0.7535121328224776, "acc_norm_stderr": 0.015411308769686933 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6416184971098265, "acc_stderr": 0.025816756791584197, "acc_norm": 0.6416184971098265, "acc_norm_stderr": 0.025816756791584197 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3553072625698324, "acc_stderr": 0.01600698993480319, "acc_norm": 0.3553072625698324, "acc_norm_stderr": 0.01600698993480319 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.630718954248366, "acc_stderr": 0.027634176689602663, "acc_norm": 0.630718954248366, "acc_norm_stderr": 0.027634176689602663 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6527331189710611, "acc_stderr": 0.027040745502307336, "acc_norm": 0.6527331189710611, "acc_norm_stderr": 0.027040745502307336 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6172839506172839, "acc_stderr": 0.027044538138402588, "acc_norm": 0.6172839506172839, "acc_norm_stderr": 0.027044538138402588 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.43617021276595747, "acc_stderr": 0.029583452036284066, "acc_norm": 0.43617021276595747, "acc_norm_stderr": 0.029583452036284066 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4132985658409387, "acc_stderr": 0.012576779494860087, "acc_norm": 0.4132985658409387, "acc_norm_stderr": 0.012576779494860087 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5588235294117647, "acc_stderr": 0.030161911930767105, "acc_norm": 0.5588235294117647, "acc_norm_stderr": 0.030161911930767105 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5555555555555556, "acc_stderr": 0.020102583895887188, 
"acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.020102583895887188 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 0.045820048415054174, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 0.045820048415054174 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6571428571428571, "acc_stderr": 0.030387262919547724, "acc_norm": 0.6571428571428571, "acc_norm_stderr": 0.030387262919547724 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7960199004975125, "acc_stderr": 0.02849317624532607, "acc_norm": 0.7960199004975125, "acc_norm_stderr": 0.02849317624532607 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.463855421686747, "acc_stderr": 0.03882310850890593, "acc_norm": 0.463855421686747, "acc_norm_stderr": 0.03882310850890593 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7426900584795322, "acc_stderr": 0.03352799844161865, "acc_norm": 0.7426900584795322, "acc_norm_stderr": 0.03352799844161865 }, "harness|truthfulqa:mc|0": { "mc1": 0.3047735618115055, "mc1_stderr": 0.016114124156882455, "mc2": 0.46287509015755035, "mc2_stderr": 0.014995166023377332 }, "harness|winogrande|5": { "acc": 0.7663772691397001, "acc_stderr": 0.011892194477183524 }, "harness|gsm8k|5": { "acc": 0.2532221379833207, "acc_stderr": 0.011978125194299687 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. 
--> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
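## Loading the aggregated results

Besides the per-task configurations, the aggregated metrics of a run can be loaded directly from the "results" configuration listed above. The snippet below is a minimal sketch that reuses the `load_dataset` pattern shown earlier in this card; the config name `results` and the `latest` split both come from this card's configuration list, and printing the first row is just one convenient way to inspect the aggregated metrics.

```python
from datasets import load_dataset

# The "results" config and the "latest" split are declared in this card's
# configuration list; timestamped splits (e.g. 2024_04_15T14_12_35.118660)
# hold the results of individual runs.
results = load_dataset(
    "open-llm-leaderboard/details_G-reen__EXPERIMENT-SFT-m7b2-1-merged",
    "results",
    split="latest",
)

print(results)      # dataset object holding the aggregated run results
print(results[0])   # first (and typically only) row of aggregated metrics
```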
zolak/twitter_dataset_50_1713199505
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 3717080 num_examples: 8978 download_size: 1840717 dataset_size: 3717080 configs: - config_name: default data_files: - split: train path: data/train-* ---
cp500/synthetic_hebrew_medical_text
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 12496549 num_examples: 4811 download_size: 5944521 dataset_size: 12496549 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "synthetic_hebrew_medical_text" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
hippocrates/Pancreatic_test
--- dataset_info: features: - name: id dtype: string - name: query dtype: string - name: answer dtype: string splits: - name: train num_bytes: 334525 num_examples: 95 - name: valid num_bytes: 334525 num_examples: 95 - name: test num_bytes: 334525 num_examples: 95 download_size: 226695 dataset_size: 1003575 configs: - config_name: default data_files: - split: train path: data/train-* - split: valid path: data/valid-* - split: test path: data/test-* ---
emma7033/test
--- license: afl-3.0 ---
imodels/multitask-tabular-datasets
--- license: mit --- This is a port of the Multi-Label Classification Dataset Repository ([link](https://www.uco.es/kdis/mllresources/#EnronDesc)). - We convert the datasets from there to simple csvs, resulting in 32 csvs (many of their mulan files fail to parse into python for us) - The targets in each csv are labeled with the suffix __target | | Dataset | Domain | m | d | q | Card | Dens | Div | avgIR | rDep | m×q×d | |---:|:------------------|:-----------|------:|-----:|----:|-------:|-------:|------:|--------:|-------:|--------------:| | 0 | 3s-bbc1000 | Text | 352 | 1000 | 6 | 1.125 | 0.188 | 0.234 | 1.718 | 0.733 | 2.11e+06 | | 1 | 3s-guardian1000 | Text | 302 | 1000 | 6 | 1.126 | 0.188 | 0.219 | 1.773 | 0.667 | 1.81e+06 | | 2 | 3s-inter3000 | Text | 169 | 3000 | 6 | 1.142 | 0.19 | 0.172 | 1.766 | 0.4 | 3.04e+06 | | 3 | 3s-reuters1000 | Text | 294 | 1000 | 6 | 1.126 | 0.188 | 0.219 | 1.789 | 0.667 | 1.76e+06 | | 4 | birds | Audio | 645 | 260 | 19 | 1.014 | 0.053 | 0.206 | 5.407 | 0.123 | 3.19e+06 | | 5 | cal500 | Music | 502 | 68 | 174 | 26.044 | 0.15 | 1 | 20.578 | 0.192 | 5.94e+06 | | 6 | chd_49 | Medicine | 555 | 49 | 6 | 2.58 | 0.43 | 0.531 | 5.766 | 0.267 | 163000 | | 7 | corel16k001 | Image | 13770 | 500 | 153 | 2.859 | 0.019 | 0.349 | 34.155 | 0.142 | 1.05e+09 | | 8 | corel16k002 | Image | 13760 | 500 | 164 | 2.882 | 0.018 | 0.354 | 37.678 | 0.128 | 1.13e+09 | | 9 | corel16k003 | Image | 13760 | 500 | 154 | 2.829 | 0.018 | 0.35 | 37.058 | 0.137 | 1.06e+09 | | 10 | corel16k004 | Image | 13840 | 500 | 162 | 2.842 | 0.018 | 0.351 | 35.899 | 0.126 | 1.12e+09 | | 11 | corel16k005 | Image | 13850 | 500 | 160 | 2.858 | 0.018 | 0.364 | 34.936 | 0.133 | 1.11e+09 | | 12 | corel16k006 | Image | 13860 | 500 | 162 | 2.885 | 0.018 | 0.361 | 33.398 | 0.128 | 1.12e+09 | | 13 | corel16k007 | Image | 13920 | 500 | 174 | 2.886 | 0.017 | 0.371 | 37.715 | 0.12 | 1.21e+09 | | 14 | corel16k008 | Image | 13860 | 500 | 168 | 2.883 | 0.017 | 0.357 | 36.2 | 0.121 | 1.16e+09 | | 15 | corel16k009 | Image | 13880 | 500 | 173 | 2.93 | 0.017 | 0.373 | 36.446 | 0.119 | 1.2e+09 | | 16 | corel16k010 | Image | 13620 | 500 | 144 | 2.815 | 0.02 | 0.345 | 32.998 | 0.147 | 9.81e+08 | | 17 | corel5k | Image | 5000 | 499 | 374 | 3.522 | 0.009 | 0.635 | 189.568 | 0.03 | 9.33e+08 | | 18 | emotions | Music | 593 | 72 | 6 | 1.868 | 0.311 | 0.422 | 1.478 | 0.933 | 256000 | | 19 | flags | Image | 194 | 19 | 7 | 3.392 | 0.485 | 0.422 | 2.255 | 0.381 | 25800 | | 20 | foodtruck | Recommend. 
| 407 | 21 | 12 | 2.29 | 0.191 | 0.285 | 7.095 | 0.409 | 103000 | | 21 | genbase | Biology | 662 | 1186 | 27 | 1.252 | 0.046 | 0.048 | 37.315 | 0.157 | 2.12e+07 | | 22 | image | Image | 2000 | 294 | 5 | 1.236 | 0.247 | 0.625 | 1.193 | 0.9 | 2.94e+06 | | 23 | mediamill | Video | 43910 | 120 | 101 | 4.376 | 0.043 | 0.149 | 256.405 | 0.342 | 5.32e+08 | | 24 | scene | Image | 2407 | 294 | 6 | 1.074 | 0.179 | 0.234 | 1.254 | 0.933 | 4.25e+06 | | 25 | stackex_chemistry | Text | 6961 | 540 | 175 | 2.109 | 0.012 | 0.436 | 56.878 | 0.056 | 6.58e+08 | | 26 | stackex_chess | Text | 1675 | 585 | 227 | 2.411 | 0.011 | 0.644 | 85.79 | 0.03 | 2.22e+08 | | 27 | stackex_cooking | Text | 10490 | 577 | 400 | 2.225 | 0.006 | 0.609 | 37.858 | 0.034 | 2.42e+09 | | 28 | stackex_cs | Text | 9270 | 635 | 274 | 2.556 | 0.009 | 0.512 | 85.002 | 0.049 | 1.61e+09 | | 29 | water-quality | Chemistry | 1060 | 16 | 14 | 5.073 | 0.362 | 0.778 | 1.767 | 0.473 | 237000 | | 30 | yeast | Biology | 2417 | 103 | 14 | 4.237 | 0.303 | 0.082 | 7.197 | 0.67 | 3.49e+06 | | 31 | yelp | Text | 10810 | 671 | 5 | 1.638 | 0.328 | 1 | 2.876 | 0.7 | 3.63e+07 | Explanation of the datasets is given below, copied from the Multi-Label Classification Dataset Repository ([link](https://www.uco.es/kdis/mllresources/#EnronDesc)). For each dataset we provide a short description as well as some characterization metrics. It includes the number of instances (m), number of attributes (d), number of labels (q), cardinality (Card), density (Dens), diversity (Div), average Imbalance Ratio per label (avgIR), ratio of unconditionally dependent label pairs by chi-square test (rDep) and complexity, defined as m × q × d as in [Read 2010]. Cardinality measures the average number of labels associated with each instance, and density is defined as cardinality divided by the number of labels. Diversity represents the percentage of labelsets present in the dataset divided by the number of possible labelsets. The avgIR measures the average degree of imbalance of all labels, the greater avgIR, the greater the imbalance of the dataset. Finally, rDep measures the proportion of pairs of labels that are dependent at 99% confidence. A broader description of all the characterization metrics and the used partition methods are described in the MLDA documentation. We also used MLDA for the characterization and partitioning of the datasets. Description of the datasets 20NG [Lang 2008]: is a compilation of around 20000 post to 20Newsgroups. Around 1000 posts are available for each group. 3sources [Greene et al. 2009]: These datasets includes 948 news articles covering 416 distinct news stories from the period February–April 2009. They have been collected from 3 sources: BBC, Reuters and The Guardian. Of these stories, 169 were reported in all three sources, 194 in two sources, and 53 appeared in a single news source. Each story was manually annotated with one or more of the six topical labels: business, entertainment, health, politics, sport, technology. In this way, three datasets with the news from BBC, Reuters and The Guardian respectively are created. A feature selection method has been performed in order to reduce the feature space and achieve a better performance. Each dataset has been selected 1000 features. Also, a dataset with the intersection (3sources-inter3000) of these three datasets (news which are in all three sources) has been created with the union of the 1000 features of each one of the datasets. 
The 3sources-inter3000 dataset can also be considered a Multi-View Multi-Label (MVML) dataset, since it includes features from 3 distinct sources. The original data has been downloaded from http://mlg.ucd.ie/datasets/3sources.html

Bibtex [Katakis et al. 2008]: This dataset is based on the data of the ECML/PKDD 2008 discovery challenge. It contains 7395 bibtex entries from the BibSonomy social bookmark and publication sharing system, annotated with a subset of the tags assigned by BibSonomy users.

Birds [Briggs et al. 2013]: It is a dataset to predict the set of bird species that are present, given a ten-second audio clip.

Bookmarks [Katakis et al. 2008]: It is based on the data of the ECML/PKDD 2008 discovery challenge and contains bookmark entries from the Bibsonomy system.

CHD_49 [Shao et al. 2013]: This dataset has information on coronary heart disease (CHD) in traditional Chinese medicine (TCM). This dataset has been filtered by specialists, removing irrelevant features and keeping only 49 features.

CAL500 [Turnbull et al. 2008]: It is a music dataset composed of 502 songs. Each one was manually annotated by at least three human annotators, who employ a vocabulary of 174 tags concerning semantic concepts. These tags span 6 semantic categories: instrumentation, vocal characteristics, genres, emotions, acoustic quality of the song, and usage terms.

Corel5k [Duygulu et al. 2002]: Corel5k is a popular benchmark for image classification and annotation methods. It is based on 5000 Corel images.

Corel16k [Barnard et al. 2003] is derived from the popular benchmark dataset ECCV 2002 by eliminating less frequently appearing labels.

Delicious [Tsoumakas et al. 2008]: This dataset contains textual data of web pages along with their tags.

Emotions [Tsoumakas et al. 2008]: Also called Music in [Read 2010]. It is a small dataset to classify music into the emotions that it evokes, according to the Tellegen-Watson-Clark model of mood: amazed-surprised, happy-pleased, relaxing-calm, quiet-still, sad-lonely and angry-aggressive. It consists of 593 songs with 6 classes.

Enron [Read et al. 2008]: The Enron dataset is a subset of the Enron email corpus, labelled with a set of categories. It is based on a collection of email messages that were categorized into 53 topic categories, such as company strategy, humour and legal advice.

Eukaryote [Xu et al. 2016]: This dataset is used to predict the sub-cellular locations of proteins according to their sequences. It contains 7766 sequences for Eukaryote species. Both the GO (Gene Ontology) features and PseAAC features (including 20 amino acid, 20 pseudo-amino acid and 400 dipeptide components) are provided. There are 22 subcellular locations (acrosome, cell membrane, cell wall, centrosome, chloroplast, cyanelle, cytoplasm, cytoskeleton, endoplasmic reticulum, endosome, extracell, golgi apparatus, hydrogenosome, lysosome, melanosome, microsome, mitochondrion, nucleus, peroxisome, spindle pole body, synapse and vacuole).

EUR-Lex [Loza and Fürnkranz 2008]: The EUR-Lex text collection is a collection of 19348 documents about European Union law. It contains many different types of documents, such as treaties, legislation, case-law and legislative proposals, which are indexed according to several orthogonal categorization schemes to allow for multiple search facilities. The most important categorization is provided by the EUROVOC descriptors, which form a topic hierarchy with almost 4000 categories regarding different aspects of European law.

Flags [Gonçalves et al. 2013]: This dataset contains details of some countries and their flags, and the goal is to predict some of the features. The dataset was used for the first time for multi-label classification in [Gonçalves et al. 2013], and the original dataset can be found at the UCI repository.

Foodtruck [Rivolli et al. 2017]: The food truck dataset was created from the answers provided by the 407 survey participants. They were either approached at fast food festivals and popular events or anonymously received a request to fill out a questionnaire, in Portuguese, describing their personal information and preferences when it comes to their selection from food trucks.

Genbase [Diplaris et al. 2005]: It is a dataset for protein function classification. Each instance is a protein and each label is a protein class. This dataset is small compared with its large number of labels.

Gnegative [Xu et al. 2016]: This dataset is used to predict the sub-cellular locations of proteins according to their sequences. It contains 1392 sequences for Gram negative bacterial (Gnegative) species. Both the GO (Gene Ontology) features and PseAAC features (including 20 amino acid, 20 pseudo-amino acid and 400 dipeptide components) are provided. There are 8 subcellular locations (cell inner membrane, cell outer membrane, cytoplasm, extracellular, fimbrium, flagellum, nucleoid and periplasm).

Gpositive [Xu et al. 2016]: This dataset is used to predict the sub-cellular locations of proteins according to their sequences. It contains 519 sequences for Gram positive species. Both the GO (Gene Ontology) features and PseAAC features (including 20 amino acid, 20 pseudo-amino acid and 400 dipeptide components) are provided. There are 4 subcellular locations (cell membrane, cell wall, cytoplasm and extracell).

Human [Xu et al. 2016]: This dataset is used to predict the sub-cellular locations of proteins according to their sequences. It contains 3106 sequences for Human species. Both the GO (Gene Ontology) features and PseAAC features (including 20 amino acid, 20 pseudo-amino acid and 400 dipeptide components) are provided. There are 14 subcellular locations (centriole, cytoplasm, cytoskeleton, endoplasmic reticulum, endosome, extracell, golgi apparatus, lysosome, microsome, mitochondrion, nucleus, peroxisome, plasma membrane, and synapse).

Image [Zhang and Zhou 2007]: This dataset is composed of 2,000 images. Concretely, each color image is first converted to the CIE Luv space, which is a more perceptually uniform color space such that perceived color differences correspond closely to Euclidean distances in this color space. After that, the image is divided into 49 blocks using a 7×7 grid, where in each block the first and second moments (mean and variance) of each band are computed, corresponding to a low-resolution image and to computationally inexpensive texture features respectively. Finally, each image is transformed into a 49×3×2 = 294-dimensional feature vector.

IMDB [Read 2010]: It contains 120919 movie plot text summaries from the Internet Movie Database (www.imdb.com), labelled with one or more genres.

LangLog [Read 2010]: It was compiled from the Language Log Forum, which discussed various topics relating to language, and 75 topics represent the label space.

Mediamill [Snoek et al. 2006]: It is a multimedia dataset for generic video indexing, which was extracted from the TRECVID 2005/2006 benchmark. This dataset contains 85 hours of international broadcast news data categorized into 100 labels, and each video instance is represented as a 120-dimensional feature vector of numeric features.

Medical [Pestian et al. 2007]: The dataset is based on the data made available during the Computational Medicine Center's 2007 Medical Natural Language Processing Challenge. It consists of 978 clinical free text reports labelled with one or more out of 45 disease codes.

Nus-Wide [Chua et al. 2009]: We provide two versions of the full NUS-WIDE dataset. In the first version, images are represented using 500-D bag of visual words features provided by the creators of the dataset [Chua et al. 2009]. In the second version, images are represented using 128-D cVLAD+ features described in [Spyromitros et al. 2014]. In both cases, the 1st attribute is the image id.

Ohsumed [Joachims 1998]: This collection includes medical abstracts from the MeSH categories of the year 1991. The specific task was to categorize the 23 cardiovascular disease categories.

Plant [Xu et al. 2016]: This dataset is used to predict the sub-cellular locations of proteins according to their sequences. It contains 978 sequences for Plant species. Both the GO (Gene Ontology) features and PseAAC features (including 20 amino acid, 20 pseudo-amino acid and 400 dipeptide components) are provided. There are 12 subcellular locations (cell membrane, cell wall, chloroplast, cytoplasm, endoplasmic reticulum, extracellular, golgi apparatus, mitochondrion, nucleus, peroxisome, plastid, and vacuole).

Reuters-RCV1 [Lewis et al. 2004]: This dataset is a well-known benchmark for text classification methods. It has 5 subsets, each one with 6000 articles assigned to one or more of 101 topics. The Reuters-K500 dataset was obtained by selecting 500 features by applying the method proposed in [Tsoumakas et al. 2007].

Scene [Boutell et al. 2004]: It is an image dataset that contains 2407 images, annotated in up to 6 classes: beach, sunset, fall foliage, field, mountain and urban. Each image is described with 294 visual numeric features corresponding to spatial colour moments in the LUV space.

Slashdot [Read 2010]: It consists of article blurbs with subject categories representing the label space, mined from http://slashdot.org.

Stackex [Charte et al. 2015]: It is a collection of six datasets generated from the text collected in a selection of Stack Exchange forums. It includes stackex_chess, stackex_chemistry, stackex_coffee, stackex_cooking, stackex_cs and stackex_philosophy.

TMC2007 [Srivastava et al. 2005]: It is a subset of the Aviation Safety Reporting System dataset. It contains 28596 aviation safety free text reports that the flight crew submit after each flight about events that took place during the flight. The goal is to label the documents with respect to what types of problems they describe. The dataset has 49060 discrete attributes corresponding to terms in the collection. The safety reports are provided with 22 labels, each of them representing a problem type that appears during a flight. Also the dataset TMC2007-500, which was obtained by selecting the top 500 features, is included.

Virus [Xu et al. 2016]: This dataset is used to predict the sub-cellular locations of proteins according to their sequences. It contains 207 sequences for Virus species. Both the GO (Gene Ontology) features and PseAAC features (including 20 amino acid, 20 pseudo-amino acid and 400 dipeptide components) are provided. There are 6 subcellular locations (viral capsid, host cell membrane, host endoplasmic reticulum, host cytoplasm, host nucleus and secreted).

Water quality [Blockeel et al. 1999]: This dataset is used to predict the quality of water of Slovenian rivers, knowing 16 characteristics such as the temperature, pH, hardness, NO2 or CO2.

Yahoo [Ueda and Saito 2002]: It is a dataset to categorize web pages and consists of 14 top-level categories, each one classified into a number of second-level categories. By focusing on second-level categories, 11 out of the 14 independent text categorization problems were used.

Yeast [Elisseeff and Weston 2001]: This dataset contains micro-array expressions and phylogenetic profiles for 2417 yeast genes. Each gene is annotated with a subset of 14 functional categories (e.g. Metabolism, energy, etc.) of the top level of the functional catalogue.

Yelp [Sajnani et al. 2013]: This dataset has been obtained from users' reviews and ratings about businesses and services on Yelp. It is used in order to categorize whether the food, service, ambiance, deals and price of one of these businesses are good or not. It contains more than 10000 user reviews. This dataset has been downloaded from http://www.ics.uci.edu/~vpsaini/.
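As a usage sketch, the snippet below loads one of these csvs, splits it into features and targets using the `__target` column suffix described above, and recomputes the cardinality and density metrics from the label matrix. The file name `emotions.csv`, its location at the repository root, and binary 0/1 target columns are assumptions for illustration; check the repository file listing for the actual csv names.

```python
import pandas as pd
from huggingface_hub import hf_hub_download

# Download one of the csvs from this dataset repo.
# NOTE: "emotions.csv" at the repo root is an assumption for illustration;
# adjust the filename to whichever csv you actually want.
csv_path = hf_hub_download(
    repo_id="imodels/multitask-tabular-datasets",
    filename="emotions.csv",
    repo_type="dataset",
)

df = pd.read_csv(csv_path)

# Targets are the columns ending in "__target"; everything else is a feature.
target_cols = [c for c in df.columns if c.endswith("__target")]
X = df.drop(columns=target_cols)
Y = df[target_cols]

# Recompute two of the characterization metrics from the table above,
# assuming binary 0/1 target columns:
# cardinality = average number of labels per instance,
# density     = cardinality divided by the number of labels.
cardinality = Y.sum(axis=1).mean()
density = cardinality / Y.shape[1]
print(f"{len(df)} rows, {X.shape[1]} features, {Y.shape[1]} labels")
print(f"cardinality={cardinality:.3f}, density={density:.3f}")
```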
miazhao/prm800k_processed_preference
--- dataset_info: features: - name: instruction dtype: string - name: responses sequence: string - name: chosen dtype: string - name: rejected dtype: string splits: - name: train num_bytes: 23805614 num_examples: 22036 download_size: 9396871 dataset_size: 23805614 --- # Dataset Card for "prm800k_processed_preference" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
msr_zhen_translation_parity
--- annotations_creators: - no-annotation language_creators: - expert-generated - machine-generated language: - en license: - ms-pl multilinguality: - monolingual - translation size_categories: - 1K<n<10K source_datasets: - extended|other-newstest2017 task_categories: - translation task_ids: [] paperswithcode_id: null pretty_name: MsrZhenTranslationParity dataset_info: features: - name: Reference-HT dtype: string - name: Reference-PE dtype: string - name: Combo-4 dtype: string - name: Combo-5 dtype: string - name: Combo-6 dtype: string - name: Online-A-1710 dtype: string splits: - name: train num_bytes: 1797033 num_examples: 2001 download_size: 0 dataset_size: 1797033 --- # Dataset Card for msr_zhen_translation_parity ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** [Translator Human Parity Data](https://msropendata.com/datasets/93f9aa87-9491-45ac-81c1-6498b6be0d0b) - **Repository:** - **Paper:** [Achieving Human Parity on Automatic Chinese to English News Translation](https://www.microsoft.com/en-us/research/publication/achieving-human-parity-on-automatic-chinese-to-english-news-translation/) - **Leaderboard:** - **Point of Contact:** ### Dataset Summary > Human evaluation results and translation output for the Translator Human Parity Data release, > as described in https://blogs.microsoft.com/ai/machine-translation-news-test-set-human-parity/ > The Translator Human Parity Data release contains all human evaluation results and translations > related to our paper "Achieving Human Parity on Automatic Chinese to English News Translation", > published on March 14, 2018. We have released this data to > 1) allow external validation of our claim of having achieved human parity > 2) to foster future research by releasing two additional human references > for the Reference-WMT test set. > The dataset includes: 1) two new references for Chinese-English language pair of WMT17, one based on human translation from scratch (Reference-HT), the other based on human post-editing (Reference-PE); 2) human parity translations generated by our research systems Combo-4, Combo-5, and Combo-6, as well as translation output from online machine translation service Online-A-1710, collected on October 16, 2017; The data package provided with the study also includes (but not parsed and provided as workable features of this dataset) all data points collected in human evaluation campaigns. 
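As a rough usage sketch of the structure described above, the snippet below pairs the human references with one of the system outputs sentence by sentence. It assumes the dataset still loads through the `datasets` library under this repository id; if the hosted loading script no longer works, point `load_dataset` at a local copy of the files instead.

```python
from datasets import load_dataset

# Assumption: the default configuration of this repository loads as-is.
ds = load_dataset("msr_zhen_translation_parity", split="train")

# Each record holds six alternative English translations of the same
# Chinese source sentence, so the columns can be compared side by side.
for row in ds.select(range(3)):
    print("HT :", row["Reference-HT"])
    print("PE :", row["Reference-PE"])
    print("MT :", row["Combo-6"])
    print("---")
```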
### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

This dataset contains 6 extra English translations for the Chinese-English language pair of WMT17.

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

As mentioned in the summary, this dataset provides 6 extra English translations of the Chinese-English language pair of WMT17. Data fields are named exactly like the associated paper for easier cross-referencing.

- `Reference-HT`: human translation from scratch.
- `Reference-PE`: human post-editing.
- `Combo-4`, `Combo-5`, `Combo-6`: three translations by research systems.
- `Online-A-1710`: a translation from an anonymous online machine translation service.

All data fields of a record are translations for the same Chinese source sentence.

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

Citation information is available at this link: [Achieving Human Parity on Automatic Chinese to English News Translation](https://www.microsoft.com/en-us/research/publication/achieving-human-parity-on-automatic-chinese-to-english-news-translation/)

### Contributions

Thanks to [@leoxzhao](https://github.com/leoxzhao) for adding this dataset.
jeffmeloy/py2dataset_TheAlgorithms_Python
---
license: mit
---

# Dataset Card for py2dataset_TheAlgorithms_Python

Dataset created using the sharegpt.json file produced by py2dataset from the TheAlgorithms/Python source code files.

- **Dataset Created Using:** https://github.com/jeffmeloy/py2dataset

## Dataset Source

- **Source Code used for Dataset:** https://github.com/TheAlgorithms/Python
- **License:** MIT

MIT License

Copyright (c) 2016-2022 TheAlgorithms and contributors

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

## Dataset Structure

The dataset follows a sharegpt structure. This means it is a list of dictionaries, with each dictionary containing a new list of dicts called conversations. Each turn in a conversation has two dictionaries, a "from" field, which denotes the role of that turn, and a "value" field which contains the actual text.

Here is an example of an entry for each python code file:

```
{
  "conversations": [
    {
      "from": "system",
      "value": "code documentation:" + <code documentation created by py2dataset>
    },
    {
      "from": "human",
      "value": "Output the Python code described by the code documentation."
    },
    {
      "from": "gpt",
      "value": <python code file listing>
    }
  ],
  "nbytes": <size of conversation in bytes>,
  "source": <source code path and filename>
},
```
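A minimal sketch for consuming entries in this format is shown below. It assumes the sharegpt-style data has been saved locally as `sharegpt.json` (the file name py2dataset produces), which may differ from how the data is actually stored in this repository.

```python
import json

# Assumption: the sharegpt-style data is available locally as "sharegpt.json";
# adapt the path to wherever the file lives for you.
with open("sharegpt.json", encoding="utf-8") as f:
    entries = json.load(f)

for entry in entries[:3]:
    turns = entry["conversations"]
    # Per the structure above: system turn (code documentation),
    # human turn (instruction), gpt turn (the Python source file).
    roles = [t["from"] for t in turns]
    print(entry["source"], entry["nbytes"], "bytes, roles:", roles)
    gpt_turn = next(t["value"] for t in turns if t["from"] == "gpt")
    print(gpt_turn[:120], "...")
```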
jxie/shapenet55
--- dataset_info: features: - name: inputs sequence: sequence: float64 - name: labels dtype: int64 splits: - name: train num_bytes: 12035988360 num_examples: 52470 download_size: 9149702428 dataset_size: 12035988360 --- # Dataset Card for "shapenet55" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
com_qa
--- language: - en license: unknown task_categories: - question-answering paperswithcode_id: comqa pretty_name: ComQA dataset_info: features: - name: cluster_id dtype: string - name: questions sequence: string - name: answers sequence: string splits: - name: train num_bytes: 692932 num_examples: 3966 - name: test num_bytes: 271554 num_examples: 2243 - name: validation num_bytes: 131129 num_examples: 966 download_size: 474169 dataset_size: 1095615 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* - split: validation path: data/validation-* --- # Dataset Card for "com_qa" ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** [http://qa.mpi-inf.mpg.de/comqa/](http://qa.mpi-inf.mpg.de/comqa/) - **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Paper:** https://doi.org/10.18653/v1/N19-1027 - **Paper:** https://arxiv.org/abs/1809.09528 - **Point of Contact:** [Rishiraj Saha Roy](https://people.mpi-inf.mpg.de/~rsaharo/) - **Size of downloaded dataset files:** 1.67 MB - **Size of the generated dataset:** 1.10 MB - **Total amount of disk used:** 2.78 MB ### Dataset Summary ComQA is a dataset of 11,214 questions, which were collected from WikiAnswers, a community question answering website. By collecting questions from such a site we ensure that the information needs are ones of interest to actual users. Moreover, questions posed there are often cannot be answered by commercial search engines or QA technology, making them more interesting for driving future research compared to those collected from an engine's query log. The dataset contains questions with various challenging phenomena such as the need for temporal reasoning, comparison (e.g., comparatives, superlatives, ordinals), compositionality (multiple, possibly nested, subquestions with multiple entities), and unanswerable questions (e.g., Who was the first human being on Mars?). Through a large crowdsourcing effort, questions in ComQA are grouped into 4,834 paraphrase clusters that express the same information need. Each cluster is annotated with its answer(s). ComQA answers come in the form of Wikipedia entities wherever possible. Wherever the answers are temporal or measurable quantities, TIMEX3 and the International System of Units (SI) are used for normalization. 
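As a rough sketch of how these paraphrase clusters can be consumed, the snippet below loads the dataset with the `datasets` library and flattens each cluster into individual question/answer pairs; treat it as illustrative rather than canonical usage.

```python
from datasets import load_dataset

# Each record is one paraphrase cluster: several question wordings
# sharing the same answer set.
comqa = load_dataset("com_qa", split="validation")

qa_pairs = []
for cluster in comqa:
    for question in cluster["questions"]:
        qa_pairs.append(
            {
                "cluster_id": cluster["cluster_id"],
                "question": question,
                "answers": cluster["answers"],
            }
        )

print(len(comqa), "clusters ->", len(qa_pairs), "question/answer pairs")
print(qa_pairs[0])
```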
### Supported Tasks and Leaderboards [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Languages [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ## Dataset Structure ### Data Instances #### default - **Size of downloaded dataset files:** 1.67 MB - **Size of the generated dataset:** 1.10 MB - **Total amount of disk used:** 2.78 MB An example of 'validation' looks as follows. ``` { "answers": ["https://en.wikipedia.org/wiki/north_sea"], "cluster_id": "cluster-922", "questions": ["what sea separates the scandinavia peninsula from britain?", "which sea separates britain from scandinavia?"] } ``` ### Data Fields The data fields are the same among all splits. #### default - `cluster_id`: a `string` feature. - `questions`: a `list` of `string` features. - `answers`: a `list` of `string` features. ### Data Splits | name |train|validation|test| |-------|----:|---------:|---:| |default| 3966| 966|2243| ## Dataset Creation ### Curation Rationale [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Source Data #### Initial Data Collection and Normalization [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) #### Who are the source language producers? [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Annotations #### Annotation process [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) #### Who are the annotators? 
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Personal and Sensitive Information

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Discussion of Biases

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Other Known Limitations

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

## Additional Information

### Dataset Curators

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Licensing Information

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Citation Information

```
@inproceedings{abujabal-etal-2019-comqa,
    title = {ComQA: A Community-sourced Dataset for Complex Factoid Question Answering with Paraphrase Clusters},
    author = {Abujabal, Abdalghani and Saha Roy, Rishiraj and Yahya, Mohamed and Weikum, Gerhard},
    booktitle = {Proceedings of the 2019 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)},
    month = {jun},
    year = {2019},
    address = {Minneapolis, Minnesota},
    publisher = {Association for Computational Linguistics},
    url = {https://www.aclweb.org/anthology/N19-1027},
    doi = {10.18653/v1/N19-1027},
    pages = {307--317},
}
```

### Contributions

Thanks to [@lewtun](https://github.com/lewtun), [@thomwolf](https://github.com/thomwolf), [@mariamabarham](https://github.com/mariamabarham), [@patrickvonplaten](https://github.com/patrickvonplaten), [@albertvillanova](https://github.com/albertvillanova) for adding this dataset.
qnguyen3/demo_faq
--- dataset_info: features: - name: conversations list: - name: from dtype: string - name: value dtype: string splits: - name: train num_bytes: 1809 num_examples: 10 download_size: 2851 dataset_size: 1809 configs: - config_name: default data_files: - split: train path: data/train-* ---
zolak/twitter_dataset_79_1713073661
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 3430266 num_examples: 8574 download_size: 1700025 dataset_size: 3430266 configs: - config_name: default data_files: - split: train path: data/train-* ---
CyberHarem/tokoro_megumi_theidolmstermillionlive
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of tokoro_megumi/所恵美/토코로메구미 (THE iDOLM@STER: Million Live!) This is the dataset of tokoro_megumi/所恵美/토코로메구미 (THE iDOLM@STER: Million Live!), containing 430 images and their tags. The core tags of this character are `long_hair, brown_hair, ahoge, blue_eyes, breasts, bangs, large_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 430 | 553.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tokoro_megumi_theidolmstermillionlive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 430 | 314.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tokoro_megumi_theidolmstermillionlive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1009 | 668.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tokoro_megumi_theidolmstermillionlive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 430 | 486.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tokoro_megumi_theidolmstermillionlive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1009 | 977.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tokoro_megumi_theidolmstermillionlive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/tokoro_megumi_theidolmstermillionlive', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | blue_sky, collarbone, day, navel, ocean, outdoors, 1girl, blush, cloud, cowboy_shot, looking_at_viewer, medium_breasts, smile, white_bikini, cleavage, hair_between_eyes, bare_arms, bare_shoulders, beach, blonde_hair, blue_bikini, blue_neckerchief, blue_sailor_collar, hand_on_hip, multiple_girls, sailor_bikini, solo_focus, standing, stomach | | 1 | 12 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, solo, cleavage, medium_breasts, looking_at_viewer, navel, black_bikini, smile, blush, water | | 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | smile, earrings, 1girl, looking_at_viewer, open_mouth, solo, blush, dress, hair_flower | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, cleavage, necklace, solo, black_jacket, blush, looking_at_viewer, simple_background, blue_shorts, denim_shorts, fur-trimmed_jacket, long_sleeves, medium_breasts, open_clothes, short_shorts, white_background, ;d, coat, collarbone, one_eye_closed, open_mouth, smile, sweatdrop, tank_top | | 4 | 9 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, looking_at_viewer, solo, blush, off_shoulder, collarbone, smile, bare_shoulders, cleavage, simple_background, white_background, necklace, upper_body, black_shirt, hair_between_eyes, medium_breasts, tank_top | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, collarbone, hat, looking_at_viewer, smile, solo, bare_shoulders, choker, cleavage, dress, one_eye_closed, skirt, strapless, bracelet, earrings, medium_breasts, white_background, ;d, black_headwear, blush, cowboy_shot, heart, open_mouth, simple_background | | 6 | 8 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, looking_at_viewer, pleated_skirt, school_uniform, solo, white_shirt, blush, long_sleeves, plaid_skirt, collared_shirt, hair_between_eyes, miniskirt, cardigan, red_necktie, cowboy_shot, green_skirt, simple_background, standing, sweater, white_background, black_thighhighs, diagonal-striped_necktie, diagonal_stripes, dress_shirt, grin, 
sitting, zettai_ryouiki | | 7 | 15 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, 1boy, blush, hetero, nipples, smile, solo_focus, sweat, penis, looking_at_viewer, navel, sex, vaginal, nude, open_mouth, pussy, female_pubic_hair, medium_breasts, spread_legs, girl_on_top, mosaic_censoring | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blue_sky | collarbone | day | navel | ocean | outdoors | 1girl | blush | cloud | cowboy_shot | looking_at_viewer | medium_breasts | smile | white_bikini | cleavage | hair_between_eyes | bare_arms | bare_shoulders | beach | blonde_hair | blue_bikini | blue_neckerchief | blue_sailor_collar | hand_on_hip | multiple_girls | sailor_bikini | solo_focus | standing | stomach | solo | black_bikini | water | earrings | open_mouth | dress | hair_flower | necklace | black_jacket | simple_background | blue_shorts | denim_shorts | fur-trimmed_jacket | long_sleeves | open_clothes | short_shorts | white_background | ;d | coat | one_eye_closed | sweatdrop | tank_top | off_shoulder | upper_body | black_shirt | hat | choker | skirt | strapless | bracelet | black_headwear | heart | pleated_skirt | school_uniform | white_shirt | plaid_skirt | collared_shirt | miniskirt | cardigan | red_necktie | green_skirt | sweater | black_thighhighs | diagonal-striped_necktie | diagonal_stripes | dress_shirt | grin | sitting | zettai_ryouiki | 1boy | hetero | nipples | sweat | penis | sex | vaginal | nude | pussy | female_pubic_hair | spread_legs | girl_on_top | mosaic_censoring | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------|:-------------|:------|:--------|:--------|:-----------|:--------|:--------|:--------|:--------------|:--------------------|:-----------------|:--------|:---------------|:-----------|:--------------------|:------------|:-----------------|:--------|:--------------|:--------------|:-------------------|:---------------------|:--------------|:-----------------|:----------------|:-------------|:-----------|:----------|:-------|:---------------|:--------|:-----------|:-------------|:--------|:--------------|:-----------|:---------------|:--------------------|:--------------|:---------------|:---------------------|:---------------|:---------------|:---------------|:-------------------|:-----|:-------|:-----------------|:------------|:-----------|:---------------|:-------------|:--------------|:------|:---------|:--------|:------------|:-----------|:-----------------|:--------|:----------------|:-----------------|:--------------|:--------------|:-----------------|:------------|:-----------|:--------------|:--------------|:----------|:-------------------|:---------------------------|:-------------------|:--------------|:-------|:----------|:-----------------|:-------|:---------|:----------|:--------|:--------|:------|:----------|:-------|:--------|:--------------------|:--------------|:--------------|:-------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 
| | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 12 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | | | | X | | | X | X | | | X | X | X | | X | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | | | | | | | X | X | | | X | | X | | | | | | | | | | | | | | | | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | | X | | | | | X | X | | | X | X | X | | X | | | | | | | | | | | | | | | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 9 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | | X | | | | | X | X | | | X | X | X | | X | X | | X | | | | | | | | | | | | X | | | | | | | X | | X | | | | | | | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | | X | | | | | X | X | | X | X | X | X | | X | | | X | | | | | | | | | | | | X | | | X | X | X | | | | X | | | | | | | X | X | | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 8 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | | | | | | | X | X | | X | X | | | | | X | | | | | | | | | | | | X | | X | | | | | | | | | X | | | | X | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | 7 | 15 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | | | | X | | | X | X | | | X | X | X | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
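For the IMG+TXT packages listed above (e.g. `dataset-800.zip`), a rough loading sketch is given below. It assumes each image in the archive sits next to a same-named `.txt` file holding its tags and that the archive extracts to a flat directory; both are common conventions for such packages but are not guaranteed by this card.

```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# Download the 800px IMG+TXT package (file name taken from the table above).
zip_file = hf_hub_download(
    repo_id='CyberHarem/tokoro_megumi_theidolmstermillionlive',
    repo_type='dataset',
    filename='dataset-800.zip',
)

dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# Assumption: every image has a same-named .txt file with its tags.
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, encoding='utf-8') as f:
                tags = f.read().strip()
            print(name, '->', tags[:80])
```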
bigbio/medhop
---
language:
- en
bigbio_language:
- English
license: cc-by-sa-3.0
multilinguality: monolingual
bigbio_license_shortname: CC_BY_SA_3p0
pretty_name: MedHop
homepage: http://qangaroo.cs.ucl.ac.uk/
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- QUESTION_ANSWERING
---

# Dataset Card for MedHop

## Dataset Description

- **Homepage:** http://qangaroo.cs.ucl.ac.uk/
- **Pubmed:** True
- **Public:** True
- **Tasks:** QA

With the same format as WikiHop, this dataset is based on research paper abstracts from PubMed, and the queries are about interactions between pairs of drugs. The correct answer has to be inferred by combining information from a chain of reactions of drugs and proteins.

## Citation Information

```
@article{welbl-etal-2018-constructing,
    title = {Constructing Datasets for Multi-hop Reading Comprehension Across Documents},
    author = {Welbl, Johannes and Stenetorp, Pontus and Riedel, Sebastian},
    journal = {Transactions of the Association for Computational Linguistics},
    volume = {6},
    year = {2018},
    address = {Cambridge, MA},
    publisher = {MIT Press},
    url = {https://aclanthology.org/Q18-1021},
    doi = {10.1162/tacl_a_00021},
    pages = {287--302},
    abstract = {Most Reading Comprehension methods limit themselves to queries which can be answered using a single sentence, paragraph, or document. Enabling models to combine disjoint pieces of textual evidence would extend the scope of machine comprehension methods, but currently no resources exist to train and test this capability. We propose a novel task to encourage the development of models for text understanding across multiple documents and to investigate the limits of existing methods. In our task, a model learns to seek and combine evidence -- effectively performing multihop, alias multi-step, inference. We devise a methodology to produce datasets for this task, given a collection of query-answer pairs and thematically linked documents. Two datasets from different domains are induced, and we identify potential pitfalls and devise circumvention strategies. We evaluate two previously proposed competitive models and find that one can integrate information across documents. However, both models struggle to select relevant information; and providing documents guaranteed to be relevant greatly improves their performance. While the models outperform several strong baselines, their best accuracy reaches 54.5 % on an annotated test set, compared to human performance at 85.0 %, leaving ample room for improvement.},
}
```
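A minimal loading sketch is given below. The configuration name follows the usual BigBIO naming pattern (`medhop_bigbio_qa`) and is an assumption here; check the repository's loading script for the exact config names if loading fails.

```python
from datasets import load_dataset

# Assumption: the config name follows the standard BigBIO pattern; the repo
# typically also exposes a "*_source" config with the original fields.
# trust_remote_code may be required because the card ships a loading script.
medhop = load_dataset(
    "bigbio/medhop",
    name="medhop_bigbio_qa",
    trust_remote_code=True,
)

print(medhop)
print(medhop["train"].features)

# Inspect one multi-hop query about a drug-drug interaction.
example = medhop["train"][0]
print({k: example[k] for k in list(example)[:5]})
```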
akash140500/failure11
--- license: apache-2.0 ---
lmms-lab/ST-VQA
--- dataset_info: features: - name: set_name dtype: string - name: file_name dtype: string - name: question dtype: string - name: image_width dtype: string - name: dataset dtype: string - name: question_tokens dtype: string - name: image_height dtype: string - name: file_path dtype: string - name: question_id dtype: string - name: image dtype: image splits: - name: test num_bytes: 501561613.95 num_examples: 4070 download_size: 327543894 dataset_size: 501561613.95 configs: - config_name: default data_files: - split: test path: data/test-* ---
P1ayer-1/5_levels_subs
--- dataset_info: features: - name: channel_id dtype: string - name: channel_url dtype: string - name: video_name dtype: string - name: video_id dtype: string - name: duration dtype: int64 - name: chapters list: - name: end_time dtype: float64 - name: start_time dtype: float64 - name: title dtype: string - name: subtitles list: - name: text dtype: string - name: timestamp sequence: float64 - name: timed_subtitles sequence: sequence: string splits: - name: train num_bytes: 2051274 num_examples: 23 download_size: 847028 dataset_size: 2051274 --- # Dataset Card for "5_levels_subs" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
magic1992/comfyui
--- license: apache-2.0 ---
Doub7e/SD-CLIP-alignment-composition
--- dataset_info: features: - name: image dtype: image - name: prompt dtype: string - name: clip_pred dtype: string splits: - name: train num_bytes: 405174703.0 num_examples: 900 download_size: 405155460 dataset_size: 405174703.0 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "SD-CLIP-alignment-composition" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Libertify/stock-sight
--- dataset_info: features: - name: image dtype: image - name: company_name dtype: string - name: commercial_use dtype: bool - name: license dtype: string - name: hash dtype: string - name: source dtype: string - name: orig_text dtype: string - name: text dtype: string splits: - name: train num_bytes: 1529658753.0 num_examples: 573 download_size: 1526083802 dataset_size: 1529658753.0 configs: - config_name: default data_files: - split: train path: data/train-* ---
autoevaluate/autoeval-eval-project-adversarial_qa-0243fffc-1303549871
--- type: predictions tags: - autotrain - evaluation datasets: - adversarial_qa eval_info: task: extractive_question_answering model: nbroad/rob-base-superqa2 metrics: [] dataset_name: adversarial_qa dataset_config: adversarialQA dataset_split: validation col_mapping: context: context question: question answers-text: answers.text answers-answer_start: answers.answer_start --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Question Answering * Model: nbroad/rob-base-superqa2 * Dataset: adversarial_qa * Config: adversarialQA * Split: validation To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@nbroad](https://huggingface.co/nbroad) for evaluating this model.
Falah/architecture_house_building_prompts_SDXL
--- dataset_info: features: - name: prompts dtype: string splits: - name: train num_bytes: 342231030 num_examples: 1000000 download_size: 43996656 dataset_size: 342231030 --- # Dataset Card for "architecture_house_building_prompts_SDXL" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_lgaalves__mistral-7b_open_platypus
--- pretty_name: Evaluation run of lgaalves/mistral-7b_open_platypus dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [lgaalves/mistral-7b_open_platypus](https://huggingface.co/lgaalves/mistral-7b_open_platypus)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__mistral-7b_open_platypus_public\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-11-18T19:20:26.136874](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__mistral-7b_open_platypus_public/blob/main/results_2023-11-18T19-20-26.136874.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5921618091275235,\n\ \ \"acc_stderr\": 0.033165593817109554,\n \"acc_norm\": 0.6007436240197009,\n\ \ \"acc_norm_stderr\": 0.03392093055241413,\n \"mc1\": 0.3292533659730722,\n\ \ \"mc1_stderr\": 0.016451264440068232,\n \"mc2\": 0.48869138188349615,\n\ \ \"mc2_stderr\": 0.0147358552004315,\n \"em\": 0.0036703020134228187,\n\ \ \"em_stderr\": 0.0006192871806511272,\n \"f1\": 0.06589450503355675,\n\ \ \"f1_stderr\": 0.0014663770308574477\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5332764505119454,\n \"acc_stderr\": 0.014578995859605808,\n\ \ \"acc_norm\": 0.5580204778156996,\n \"acc_norm_stderr\": 0.014512682523128343\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6120294761999602,\n\ \ \"acc_stderr\": 0.004862919176408075,\n \"acc_norm\": 0.8212507468631747,\n\ \ \"acc_norm_stderr\": 0.003823591814133036\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\ \ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\ \ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n\ \ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\ \ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\ : 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\ acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n \ \ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 
0.6458333333333334,\n\ \ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\ \ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\ \ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709390974,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709390974\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\ \ \"acc_stderr\": 0.03750757044895536,\n \"acc_norm\": 0.5895953757225434,\n\ \ \"acc_norm_stderr\": 0.03750757044895536\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\ \ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\ \ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.03268572658667492,\n\ \ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.03268572658667492\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\ \ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\ \ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.041657747757287644,\n\ \ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.041657747757287644\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406776,\n \"\ acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406776\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\ \ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\ \ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n\ \ \"acc_stderr\": 0.026522709674667765,\n \"acc_norm\": 0.6806451612903226,\n\ \ \"acc_norm_stderr\": 0.026522709674667765\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959217,\n\ \ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959217\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\ : 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\ \ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n 
\"\ acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723872,\n\ \ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723872\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.025217315184846486,\n\ \ \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.025217315184846486\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2814814814814815,\n \"acc_stderr\": 0.02742001935094528,\n \ \ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.02742001935094528\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n\ \ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\ acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7724770642201835,\n \"acc_stderr\": 0.017974463578776502,\n \"\ acc_norm\": 0.7724770642201835,\n \"acc_norm_stderr\": 0.017974463578776502\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\ acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7696078431372549,\n \"acc_stderr\": 0.02955429260569507,\n \"\ acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.02955429260569507\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \ \ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n\ \ \"acc_stderr\": 0.029918586707798834,\n \"acc_norm\": 0.726457399103139,\n\ \ \"acc_norm_stderr\": 0.029918586707798834\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\ \ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547128,\n \"\ acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547128\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\ \ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\ \ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\ \ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\ \ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\ \ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\ \ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\ \ 
},\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\ \ \"acc_stderr\": 0.024414947304543678,\n \"acc_norm\": 0.8333333333333334,\n\ \ \"acc_norm_stderr\": 0.024414947304543678\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7956577266922095,\n\ \ \"acc_stderr\": 0.014419123980931895,\n \"acc_norm\": 0.7956577266922095,\n\ \ \"acc_norm_stderr\": 0.014419123980931895\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n\ \ \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n\ \ \"acc_stderr\": 0.016204672385106603,\n \"acc_norm\": 0.376536312849162,\n\ \ \"acc_norm_stderr\": 0.016204672385106603\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.027363593284684972,\n\ \ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.027363593284684972\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\ \ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\ \ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\ \ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \ \ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n\ \ \"acc_stderr\": 0.012695244711379774,\n \"acc_norm\": 0.44589308996088656,\n\ \ \"acc_norm_stderr\": 0.012695244711379774\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5919117647058824,\n \"acc_stderr\": 0.029855261393483924,\n\ \ \"acc_norm\": 0.5919117647058824,\n \"acc_norm_stderr\": 0.029855261393483924\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6209150326797386,\n \"acc_stderr\": 0.01962744474841223,\n \ \ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.01962744474841223\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\ \ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\ \ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\ \ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\ \ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n\ \ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\ \ \"acc_stderr\": 0.0389136449583582,\n \"acc_norm\": 0.4879518072289157,\n\ \ \"acc_norm_stderr\": 0.0389136449583582\n },\n 
\"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n\ \ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3292533659730722,\n\ \ \"mc1_stderr\": 0.016451264440068232,\n \"mc2\": 0.48869138188349615,\n\ \ \"mc2_stderr\": 0.0147358552004315\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090254\n\ \ },\n \"harness|drop|3\": {\n \"em\": 0.0036703020134228187,\n \ \ \"em_stderr\": 0.0006192871806511272,\n \"f1\": 0.06589450503355675,\n\ \ \"f1_stderr\": 0.0014663770308574477\n },\n \"harness|gsm8k|5\":\ \ {\n \"acc\": 0.12585291887793784,\n \"acc_stderr\": 0.009136212598406307\n\ \ }\n}\n```" repo_url: https://huggingface.co/lgaalves/mistral-7b_open_platypus leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|arc:challenge|25_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-11-18T19-20-26.136874.parquet' - config_name: harness_drop_3 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|drop|3_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|drop|3_2023-11-18T19-20-26.136874.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|gsm8k|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hellaswag|10_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T19-20-26.136874.parquet' - 
'**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-management|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-11-18T19-20-26.136874.parquet' - 
'**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T19-20-26.136874.parquet' - 
'**/details_harness|hendrycksTest-international_law|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-management|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-11-18T19-20-26.136874.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-international_law|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - 
'**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-management|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-marketing|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_law|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-sociology|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-virology|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T19-20-26.136874.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|truthfulqa:mc|0_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-11-18T19-20-26.136874.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_11_18T19_20_26.136874 path: - '**/details_harness|winogrande|5_2023-11-18T19-20-26.136874.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-11-18T19-20-26.136874.parquet' - config_name: results data_files: - split: 2023_11_18T19_20_26.136874 path: - results_2023-11-18T19-20-26.136874.parquet - split: latest path: - results_2023-11-18T19-20-26.136874.parquet --- # Dataset Card for Evaluation run of lgaalves/mistral-7b_open_platypus ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/lgaalves/mistral-7b_open_platypus - **Paper:** - **Leaderboard:** 
https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [lgaalves/mistral-7b_open_platypus](https://huggingface.co/lgaalves/mistral-7b_open_platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__mistral-7b_open_platypus_public",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-11-18T19:20:26.136874](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__mistral-7b_open_platypus_public/blob/main/results_2023-11-18T19-20-26.136874.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.5921618091275235,
        "acc_stderr": 0.033165593817109554,
        "acc_norm": 0.6007436240197009,
        "acc_norm_stderr": 0.03392093055241413,
        "mc1": 0.3292533659730722,
        "mc1_stderr": 0.016451264440068232,
        "mc2": 0.48869138188349615,
        "mc2_stderr": 0.0147358552004315,
        "em": 0.0036703020134228187,
        "em_stderr": 0.0006192871806511272,
        "f1": 0.06589450503355675,
        "f1_stderr": 0.0014663770308574477
    },
    "harness|arc:challenge|25": {
        "acc": 0.5332764505119454,
        "acc_stderr": 0.014578995859605808,
        "acc_norm": 0.5580204778156996,
        "acc_norm_stderr": 0.014512682523128343
    },
    "harness|hellaswag|10": {
        "acc": 0.6120294761999602,
        "acc_stderr": 0.004862919176408075,
        "acc_norm": 0.8212507468631747,
        "acc_norm_stderr": 0.003823591814133036
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.27,
        "acc_stderr": 0.044619604333847394,
        "acc_norm": 0.27,
        "acc_norm_stderr": 0.044619604333847394
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.5703703703703704,
        "acc_stderr": 0.042763494943765995,
        "acc_norm": 0.5703703703703704,
        "acc_norm_stderr": 0.042763494943765995
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.6381578947368421,
        "acc_stderr": 0.03910525752849724,
        "acc_norm": 0.6381578947368421,
        "acc_norm_stderr": 0.03910525752849724
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.55,
        "acc_stderr": 0.05,
        "acc_norm": 0.55,
        "acc_norm_stderr": 0.05
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.6566037735849056,
        "acc_stderr": 0.02922452646912479,
        "acc_norm": 0.6566037735849056,
        "acc_norm_stderr": 0.02922452646912479
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.6458333333333334,
        "acc_stderr": 0.039994111357535424,
        "acc_norm": 0.6458333333333334,
        "acc_norm_stderr": 0.039994111357535424
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.46,
        "acc_stderr": 0.05009082659620333,
        "acc_norm": 0.46,
        "acc_norm_stderr": 0.05009082659620333
    },
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.048523658709390974, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709390974 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5895953757225434, "acc_stderr": 0.03750757044895536, "acc_norm": 0.5895953757225434, "acc_norm_stderr": 0.03750757044895536 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3137254901960784, "acc_stderr": 0.04617034827006717, "acc_norm": 0.3137254901960784, "acc_norm_stderr": 0.04617034827006717 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816506, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4978723404255319, "acc_stderr": 0.03268572658667492, "acc_norm": 0.4978723404255319, "acc_norm_stderr": 0.03268572658667492 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.04692008381368909, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.04692008381368909 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5103448275862069, "acc_stderr": 0.041657747757287644, "acc_norm": 0.5103448275862069, "acc_norm_stderr": 0.041657747757287644 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42328042328042326, "acc_stderr": 0.025446365634406776, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.025446365634406776 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.373015873015873, "acc_stderr": 0.04325506042017086, "acc_norm": 0.373015873015873, "acc_norm_stderr": 0.04325506042017086 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6806451612903226, "acc_stderr": 0.026522709674667765, "acc_norm": 0.6806451612903226, "acc_norm_stderr": 0.026522709674667765 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.46798029556650245, "acc_stderr": 0.03510766597959217, "acc_norm": 0.46798029556650245, "acc_norm_stderr": 0.03510766597959217 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7393939393939394, "acc_stderr": 0.034277431758165236, "acc_norm": 0.7393939393939394, "acc_norm_stderr": 0.034277431758165236 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7272727272727273, "acc_stderr": 0.03173071239071724, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.03173071239071724 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8497409326424871, "acc_stderr": 0.025787723180723872, "acc_norm": 0.8497409326424871, "acc_norm_stderr": 0.025787723180723872 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5512820512820513, "acc_stderr": 0.025217315184846486, "acc_norm": 0.5512820512820513, "acc_norm_stderr": 0.025217315184846486 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2814814814814815, "acc_stderr": 0.02742001935094528, "acc_norm": 0.2814814814814815, "acc_norm_stderr": 0.02742001935094528 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5672268907563025, "acc_stderr": 0.032183581077426124, "acc_norm": 
0.5672268907563025, "acc_norm_stderr": 0.032183581077426124 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.03879687024073327, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.03879687024073327 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7724770642201835, "acc_stderr": 0.017974463578776502, "acc_norm": 0.7724770642201835, "acc_norm_stderr": 0.017974463578776502 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.39814814814814814, "acc_stderr": 0.033384734032074016, "acc_norm": 0.39814814814814814, "acc_norm_stderr": 0.033384734032074016 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7696078431372549, "acc_stderr": 0.02955429260569507, "acc_norm": 0.7696078431372549, "acc_norm_stderr": 0.02955429260569507 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7848101265822784, "acc_stderr": 0.026750826994676177, "acc_norm": 0.7848101265822784, "acc_norm_stderr": 0.026750826994676177 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.726457399103139, "acc_stderr": 0.029918586707798834, "acc_norm": 0.726457399103139, "acc_norm_stderr": 0.029918586707798834 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6564885496183206, "acc_stderr": 0.041649760719448786, "acc_norm": 0.6564885496183206, "acc_norm_stderr": 0.041649760719448786 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8429752066115702, "acc_stderr": 0.03321244842547128, "acc_norm": 0.8429752066115702, "acc_norm_stderr": 0.03321244842547128 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7037037037037037, "acc_stderr": 0.044143436668549335, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.044143436668549335 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.6990291262135923, "acc_stderr": 0.04541609446503948, "acc_norm": 0.6990291262135923, "acc_norm_stderr": 0.04541609446503948 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8333333333333334, "acc_stderr": 0.024414947304543678, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.024414947304543678 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7956577266922095, "acc_stderr": 0.014419123980931895, "acc_norm": 0.7956577266922095, "acc_norm_stderr": 0.014419123980931895 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6936416184971098, "acc_stderr": 0.024818350129436593, "acc_norm": 0.6936416184971098, "acc_norm_stderr": 0.024818350129436593 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.376536312849162, "acc_stderr": 0.016204672385106603, "acc_norm": 0.376536312849162, "acc_norm_stderr": 0.016204672385106603 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6470588235294118, "acc_stderr": 0.027363593284684972, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.027363593284684972 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.025922371788818763, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.025922371788818763 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.7129629629629629, "acc_stderr": 0.02517104191530968, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.02517104191530968 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44589308996088656, "acc_stderr": 0.012695244711379774, "acc_norm": 0.44589308996088656, "acc_norm_stderr": 0.012695244711379774 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5919117647058824, "acc_stderr": 0.029855261393483924, "acc_norm": 0.5919117647058824, "acc_norm_stderr": 0.029855261393483924 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6209150326797386, "acc_stderr": 0.01962744474841223, "acc_norm": 0.6209150326797386, "acc_norm_stderr": 0.01962744474841223 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6272727272727273, "acc_stderr": 0.04631381319425465, "acc_norm": 0.6272727272727273, "acc_norm_stderr": 0.04631381319425465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6244897959183674, "acc_stderr": 0.03100120903989484, "acc_norm": 0.6244897959183674, "acc_norm_stderr": 0.03100120903989484 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7512437810945274, "acc_stderr": 0.030567675938916714, "acc_norm": 0.7512437810945274, "acc_norm_stderr": 0.030567675938916714 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.4879518072289157, "acc_stderr": 0.0389136449583582, "acc_norm": 0.4879518072289157, "acc_norm_stderr": 0.0389136449583582 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7777777777777778, "acc_stderr": 0.031885780176863984, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.031885780176863984 }, "harness|truthfulqa:mc|0": { "mc1": 0.3292533659730722, "mc1_stderr": 0.016451264440068232, "mc2": 0.48869138188349615, "mc2_stderr": 0.0147358552004315 }, "harness|winogrande|5": { "acc": 0.7861089187056038, "acc_stderr": 0.011524466954090254 }, "harness|drop|3": { "em": 0.0036703020134228187, "em_stderr": 0.0006192871806511272, "f1": 0.06589450503355675, "f1_stderr": 0.0014663770308574477 }, "harness|gsm8k|5": { "acc": 0.12585291887793784, "acc_stderr": 0.009136212598406307 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
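As a complement to the loading snippet in the Dataset Summary above, the aggregated `results` configuration can be read the same way. A minimal sketch, assuming the `results` config name, the `latest` split, and the `_public` repository name that the card's own snippet and configuration section use:

```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split points to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_lgaalves__mistral-7b_open_platypus_public",
    "results",
    split="latest",
)
print(results.column_names)
```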
suolyer/pile_freelaw
--- license: apache-2.0 ---
MouhsineGT/text_new_fr_v1
--- license: unknown ---
Karavet/ARPA-Armenian-Paraphrase-Corpus
---
language:
- hy
task_categories: [paraphrase, paraphrase detection]
multilinguality: [monolingual]
task_ids: [paraphrase, paraphrase detection]
---

## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Structure](#dataset-structure)
- [Dataset Evaluation](#dataset-evaluation)
- [Additional Information](#additional-information)

## Dataset Description

We provide sentential paraphrase detection train and test datasets as well as BERT-based models for the Armenian language.

### Dataset Summary

The sentences in the dataset are taken from [Hetq](https://hetq.am/) and [Panarmenian](http://www.panarmenian.net/) news articles. To generate paraphrases for the sentences, we used back translation from Armenian to English. We repeated the step twice, after which the generated paraphrases were manually reviewed. Invalid sentences were filtered out, while the rest were labelled as either paraphrase, near paraphrase or non-paraphrase. Test examples were reviewed by 3 different annotators. In addition, to increase the number of non-paraphrase pairs, we padded the dataset with automatically generated negative examples, including pairs of consecutive sentences and random pairs.

## Dataset Structure

Each row consists of two sentences and their label. These sentences were labelled as either paraphrase, near paraphrase or non-paraphrase (with labels 1, 0 and -1 respectively). The sentences are divided into train and test sets.

|Number of examples|Total|Paraphrase|Non-paraphrase (near paraphrase)|
|:-- | :---: | :---: | :---: |
|Train | 4233 |1339 |2683 (211) |
|Test | 1682 |1021 |448 (213) |

### Dataset Evaluation

We finetuned Multilingual BERT on several training sets, including the proposed ARPA dataset, and evaluated the resulting models on our test set. During training and evaluation, near paraphrase and non-paraphrase pairs were combined into one class. The results are provided below:

|BERT Model | Train set | F1 | Acc. |
|:-- | :---: | :---: | :---: |
|Multilingual BERT | ARPA train set| 84.27| 78.06|
|Multilingual BERT | Paraphraser.ru train set machine-translated into Armenian | 83.81 | 77.09 |
|Multilingual BERT | MRPC train set machine-translated into Armenian | 80.07 | 69.87 |
|Multilingual BERT | All of the above combined | 84 |77.6 |

#### Additional Information

The model trained on ARPA is available for use, and can be downloaded using this [link](https://drive.google.com/uc?id=14owW5kkZ1JiNa6P-676e-QIj8m8i5e_8).

For more details about the models and dataset construction, refer to the [paper](https://arxiv.org/pdf/2009.12615).
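As a quick illustration of the structure described above, here is a minimal sketch of reading the sentence pairs and collapsing the three-way label into the binary scheme used for evaluation. The file name and column names are assumptions for illustration only, not the official layout of the released files:

```python
import pandas as pd

# Hypothetical file and column names; adjust to the actual layout of the released train/test files.
train = pd.read_csv("arpa_train.tsv", sep="\t", names=["sentence1", "sentence2", "label"])

# Corpus labels: 1 = paraphrase, 0 = near paraphrase, -1 = non-paraphrase.
# For the evaluation reported above, near paraphrase and non-paraphrase were merged into one class,
# so collapse the three-way label into a binary one.
train["binary_label"] = (train["label"] == 1).astype(int)

print(train["binary_label"].value_counts())
```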
botbot-ai/MetaMathQA-40K-PTBR
---
license: cc
language:
- pt
pretty_name: MetaMathQA 40k PTBR
---
Translation of MetaMathQA-40K into Portuguese using NLLB 3.3b.
pbwinter/tokenized_masked_hindi_wiki
--- dataset_info: features: - name: labels sequence: int64 - name: input_ids sequence: int32 - name: token_type_ids sequence: int8 - name: attention_mask sequence: int8 splits: - name: train num_bytes: 4224326912 num_examples: 2336464 download_size: 343947644 dataset_size: 4224326912 configs: - config_name: default data_files: - split: train path: data/train-* ---
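Since the card only lists the tokenized features, here is a minimal sketch of inspecting them, assuming the repository's parquet files load directly with `datasets`:

```python
from datasets import load_dataset

# Load the pre-tokenized, pre-masked Hindi Wikipedia examples.
ds = load_dataset("pbwinter/tokenized_masked_hindi_wiki", split="train")

example = ds[0]
print(example.keys())             # input_ids, token_type_ids, attention_mask, labels
print(len(example["input_ids"]))  # sequence length of the first example
```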