| datasetId | card |
|---|---|
Jerry-Master/lung-tumour-study | ---
license: cc-by-nc-4.0
---
# Combining graph neural networks and computer vision methods for cell nuclei classification in lung tissue
This is the dataset of the article in the title. It contains 85 patches of 1024x1024 pixels from H&E-stained WSIs of 9 different patients. It contains two main classes: tumoural (2) and non-tumoural (1). Due to the difficulty of the problem, 153 cells were labelled as uncertain. For technical reasons, we removed them from the train and validation sets and carefully chose the test set so that it includes no uncertain cells. In total there are 21255 cells in the train set, 4114 in the validation set and 5533 in the test set. We manually verified that no patient appears in more than one split, ensuring that the split has no data leakage in any way.
This repo is just a copy of [https://zenodo.org/doi/10.5281/zenodo.8368122](https://zenodo.org/doi/10.5281/zenodo.8368122).
## Structure
The data is provided in several formats. The orig folder contains the images without any annotations. The overlay folder contains the same images with the cells overlaid on top for visualization purposes: healthy cells in red and tumoural cells in green. Annotations were made using QuPath; the raw GeoJSON files exported from the application are in raw_geojson. Bear in mind that they may contain duplicated cells and uncertain cells. We are releasing them together with the scripts in the scripts folder so that any interested researcher can load the annotations back into QuPath and review the labels. If you, as an expert, believe we have incorrectly labelled some cells, please feel free to contact us. The rest of the folders (train, validation, test) contain the data ready to use, with the same structure as specified in the [tumourkit package documentation](https://lung-tumour-study.readthedocs.io/en/latest/usage.html#make-dirs). Just move them into the data folder. Note that you will need to move the orig folder too.
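QuPath exports annotations as GeoJSON feature collections. Below is a minimal sketch of counting annotated cells per class, assuming each feature stores its label under `properties.classification.name`; the exact property layout of the files in raw_geojson may differ, and the inline document here is a hypothetical example, not one of the released files.

```python
import json
from collections import Counter

# Hypothetical minimal QuPath-style GeoJSON export (illustration only).
geojson_text = json.dumps({
    "type": "FeatureCollection",
    "features": [
        {"type": "Feature",
         "geometry": {"type": "Polygon",
                      "coordinates": [[[0, 0], [0, 5], [5, 5], [0, 0]]]},
         "properties": {"classification": {"name": "tumoural"}}},
        {"type": "Feature",
         "geometry": {"type": "Polygon",
                      "coordinates": [[[10, 10], [10, 15], [15, 15], [10, 10]]]},
         "properties": {"classification": {"name": "non-tumoural"}}},
    ],
})

def count_classes(text):
    """Count annotated cells per class label in a GeoJSON export."""
    features = json.loads(text)["features"]
    return Counter(f["properties"]["classification"]["name"] for f in features)

counts = count_classes(geojson_text)  # e.g. {'tumoural': 1, 'non-tumoural': 1}
```

The same loop is a convenient place to drop duplicated or uncertain cells before converting the annotations to another format.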
Any pred or hov folder is provided as an example; they contain predictions from one of our models. If you train your own models, you should delete them. The npy folders contain 518x518 crops of the original images. You can train Hovernet with other shapes if you want by modifying the code provided by the [Tumourkit library](https://github.com/Jerry-Master/lung-tumour-study).
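Cutting the 1024x1024 patches into 518x518 windows can be sketched with plain NumPy. This is only an illustration of one possible sliding-window scheme (the last window in each axis is shifted back to stay inside the image); the actual cropping strategy used to build the npy folders lives in the Tumourkit code and may differ.

```python
import numpy as np

def sliding_crops(image, size=518):
    """Cut an H x W (x C) array into size x size crops.

    Windows advance by `size`; the final window in each axis is clamped to
    the image border, so edge crops may overlap their neighbours.
    """
    h, w = image.shape[:2]
    ys = sorted({min(y, h - size) for y in range(0, h, size)})
    xs = sorted({min(x, w - size) for x in range(0, w, size)})
    return [image[y:y + size, x:x + size] for y in ys for x in xs]

patch = np.zeros((1024, 1024, 3), dtype=np.uint8)  # stand-in for one H&E patch
crops = sliding_crops(patch)                        # 2 x 2 = 4 crops
```

A 1024-pixel axis yields window origins 0 and 506, so each patch produces four overlapping crops.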
# Citation
```
@article{PerezCano2024,
author = {Jose Pérez-Cano and Irene Sansano Valero and David Anglada-Rotger and Oscar Pina and Philippe Salembier and Ferran Marques},
title = {Combining graph neural networks and computer vision methods for cell nuclei classification in lung tissue},
journal = {Heliyon},
year = {2024},
volume = {10},
number = {7},
doi = {10.1016/j.heliyon.2024.e28463},
}
``` |
mac326/test | ---
license: openrail
---
|
NarchAI1992/milimetvuong | ---
license: openrail
---
|
sozercan/k8s-instructions | ---
license: apache-2.0
---
This is a fork from https://huggingface.co/datasets/substratusai/k8s-instructions |
open-llm-leaderboard/details_nbeerbower__flammen11-mistral-7B | ---
pretty_name: Evaluation run of nbeerbower/flammen11-mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nbeerbower/flammen11-mistral-7B](https://huggingface.co/nbeerbower/flammen11-mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nbeerbower__flammen11-mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T16:18:32.327853](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__flammen11-mistral-7B/blob/main/results_2024-03-24T16-18-32.327853.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6557201274281064,\n\
\ \"acc_stderr\": 0.03213686194120598,\n \"acc_norm\": 0.6555466414649391,\n\
\ \"acc_norm_stderr\": 0.03280266988592698,\n \"mc1\": 0.5495716034271726,\n\
\ \"mc1_stderr\": 0.01741726437196764,\n \"mc2\": 0.7173241167460739,\n\
\ \"mc2_stderr\": 0.014561802998456887\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6919795221843004,\n \"acc_stderr\": 0.013491429517292038,\n\
\ \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520762\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7033459470225055,\n\
\ \"acc_stderr\": 0.004558491550673701,\n \"acc_norm\": 0.880601473809998,\n\
\ \"acc_norm_stderr\": 0.0032359418109431525\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106136,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106136\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812142,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812142\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335082,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335082\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465066,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465066\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4346368715083799,\n\
\ \"acc_stderr\": 0.016578997435496717,\n \"acc_norm\": 0.4346368715083799,\n\
\ \"acc_norm_stderr\": 0.016578997435496717\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757485,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757485\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.023683591837008557,\n\
\ \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.023683591837008557\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6895424836601307,\n \"acc_stderr\": 0.018718067052623227,\n \
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.018718067052623227\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786855,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786855\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5495716034271726,\n\
\ \"mc1_stderr\": 0.01741726437196764,\n \"mc2\": 0.7173241167460739,\n\
\ \"mc2_stderr\": 0.014561802998456887\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8216258879242304,\n \"acc_stderr\": 0.01075935201485594\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6952236542835482,\n \
\ \"acc_stderr\": 0.012679297549515424\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/flammen11-mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|arc:challenge|25_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|gsm8k|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hellaswag|10_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T16-18-32.327853.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T16-18-32.327853.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- '**/details_harness|winogrande|5_2024-03-24T16-18-32.327853.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T16-18-32.327853.parquet'
- config_name: results
data_files:
- split: 2024_03_24T16_18_32.327853
path:
- results_2024-03-24T16-18-32.327853.parquet
- split: latest
path:
- results_2024-03-24T16-18-32.327853.parquet
---
# Dataset Card for Evaluation run of nbeerbower/flammen11-mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nbeerbower/flammen11-mistral-7B](https://huggingface.co/nbeerbower/flammen11-mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nbeerbower__flammen11-mistral-7B",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-24T16:18:32.327853](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__flammen11-mistral-7B/blob/main/results_2024-03-24T16-18-32.327853.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.6557201274281064,
"acc_stderr": 0.03213686194120598,
"acc_norm": 0.6555466414649391,
"acc_norm_stderr": 0.03280266988592698,
"mc1": 0.5495716034271726,
"mc1_stderr": 0.01741726437196764,
"mc2": 0.7173241167460739,
"mc2_stderr": 0.014561802998456887
},
"harness|arc:challenge|25": {
"acc": 0.6919795221843004,
"acc_stderr": 0.013491429517292038,
"acc_norm": 0.7098976109215017,
"acc_norm_stderr": 0.013261573677520762
},
"harness|hellaswag|10": {
"acc": 0.7033459470225055,
"acc_stderr": 0.004558491550673701,
"acc_norm": 0.880601473809998,
"acc_norm_stderr": 0.0032359418109431525
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106136,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106136
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813821,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813821
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812142,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812142
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335082,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335082
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465066,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465066
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4346368715083799,
"acc_stderr": 0.016578997435496717,
"acc_norm": 0.4346368715083799,
"acc_norm_stderr": 0.016578997435496717
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757485,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757485
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.023683591837008557,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.023683591837008557
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.018718067052623227,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.018718067052623227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786855,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786855
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5495716034271726,
"mc1_stderr": 0.01741726437196764,
"mc2": 0.7173241167460739,
"mc2_stderr": 0.014561802998456887
},
"harness|winogrande|5": {
"acc": 0.8216258879242304,
"acc_stderr": 0.01075935201485594
},
"harness|gsm8k|5": {
"acc": 0.6952236542835482,
"acc_stderr": 0.012679297549515424
}
}
```
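As a rough sketch of how the per-task metrics above can be aggregated locally, the snippet below averages the `acc` values of the MMLU (`hendrycksTest-*`) tasks. Note this simple mean over a subset of tasks is only illustrative: the `all` entry reported above is computed by the leaderboard over every task, so the numbers will not match exactly. The truncated `results` dict here is an assumption standing in for the full JSON shown above.

```python
# Sketch: average per-task MMLU accuracies from the results JSON above.
# `results` is assumed to hold the "Latest results" object (truncated here
# to three tasks for illustration).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.37},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7039473684210527},
}

# Select only the MMLU tasks by their "hendrycksTest" key prefix.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU average over {len(mmlu_accs)} tasks: {mmlu_avg:.4f}")
```

Running the same loop over the full dict reproduces a per-category MMLU average comparable to the `acc` figure under `all`.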
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Lk123/msmarco_hn | ---
license: apache-2.0
dataset_info:
features:
- name: query
dtype: string
- name: pos
sequence: string
- name: neg
sequence: string
splits:
- name: train
num_bytes: 4795960731
num_examples: 485823
download_size: 2660748125
dataset_size: 4795960731
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ericbear0602/dataset | ---
license: mit
---
|
heliosprime/twitter_dataset_1713155733 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 6022
num_examples: 15
download_size: 11306
dataset_size: 6022
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713155733"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sadanalog/NLLB3.3B_XQuAD_TH_sent_span | ---
dataset_info:
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: text
sequence: string
splits:
- name: train
num_bytes: 2570720
num_examples: 1190
download_size: 503670
dataset_size: 2570720
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "NLLB3.3B_XQuAD_TH_sent_span"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/Caltech101_with_background_test_embeddings | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: int64
- name: vision_embeddings
sequence: float32
splits:
- name: openai_clip_vit_large_patch14
num_bytes: 113162423.0
num_examples: 6084
download_size: 116470550
dataset_size: 113162423.0
---
# Dataset Card for "Caltech101_with_background_test_embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sabarzii/NLP_summery_Books | ---
dataset_info:
features:
- name: crime
dtype: string
- name: romance
dtype: string
- name: psychology
dtype: string
splits:
- name: train
num_bytes: 254995038
num_examples: 2679
download_size: 154168098
dataset_size: 254995038
---
# Dataset Card for "NLP_summery_Books"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1712991790 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 10520
num_examples: 23
download_size: 9376
dataset_size: 10520
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712991790"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vibha-mah/Bat-Classification | ---
task_categories:
- audio-classification
language:
- en
tags:
- biology
- medical
- science
- bats
pretty_name: Bat Classification in Europe
--- |
ChuGyouk/openorca_t0_filtered | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 3666271416.621207
num_examples: 2149573
download_size: 2463718307
dataset_size: 3666271416.621207
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Codec-SUPERB/gtzan_unit | ---
configs:
- config_name: default
data_files:
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: academicodec_hifi_16k_320d
num_bytes: 48069680
num_examples: 1000
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 48069680
num_examples: 1000
- name: academicodec_hifi_24k_320d
num_bytes: 72069680
num_examples: 1000
- name: audiodec_24k_320d
num_bytes: 153685680
num_examples: 1000
- name: dac_16k
num_bytes: 157349680
num_examples: 1000
- name: dac_24k
num_bytes: 643509680
num_examples: 1000
- name: dac_44k
num_bytes: 209753680
num_examples: 1000
- name: encodec_24k_12bps
num_bytes: 288117680
num_examples: 1000
- name: encodec_24k_1_5bps
num_bytes: 36061680
num_examples: 1000
- name: encodec_24k_24bps
num_bytes: 576181680
num_examples: 1000
- name: encodec_24k_3bps
num_bytes: 72069680
num_examples: 1000
- name: encodec_24k_6bps
num_bytes: 144085680
num_examples: 1000
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 384437680
num_examples: 1000
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 384437680
num_examples: 1000
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 384181680
num_examples: 1000
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 192181680
num_examples: 1000
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 384181680
num_examples: 1000
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 192181680
num_examples: 1000
- name: speech_tokenizer_16k
num_bytes: 96085680
num_examples: 1000
download_size: 697459099
dataset_size: 4466711920
---
# Dataset Card for "gtzan_unit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
matallanas/ignatius | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 5743920
num_examples: 28
download_size: 5743956
dataset_size: 5743920
license: openrail
task_categories:
- text-to-image
---
# Dataset Card for "ignatius"
This dataset was created to participate in the Keras DreamBooth sprint. It is based on the Spanish comedian [Ignatius Farray](https://es.wikipedia.org/wiki/Ignatius_Farray).
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
trondizzy/para_legal | ---
license: cc
task_categories:
- translation
language:
- uk
- en
size_categories:
- n<1K
--- |
hazyresearch/based-fda | ---
language:
- en
dataset_info:
features:
- name: doc_id
dtype: string
- name: file_name
dtype: string
- name: key
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: validation
num_bytes: 8498008
num_examples: 1102
download_size: 1381388
dataset_size: 8498008
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
task_categories:
- question-answering
- feature-extraction
--- |
CVasNLPExperiments/docvqa_test_google_flan_t5_xxl_mode_OCR_VQA_Q_rices_ns_10 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 830
num_examples: 10
download_size: 3053
dataset_size: 830
configs:
- config_name: default
data_files:
- split: fewshot_0
path: data/fewshot_0-*
---
|
tgsc/c4-pt-randMore35M-part04-deduplicated-128000-no-digit-split-mask-train-15003771-lines | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 53736539138
num_examples: 15003771
download_size: 24401176899
dataset_size: 53736539138
---
# Dataset Card for "c4-pt-randMore35M-part04-deduplicated-128000-no-digit-split-mask-train-15003771-lines"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Paulitos/school-math-questions-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 536113
num_examples: 1000
download_size: 269362
dataset_size: 536113
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TheDuyx/augmented_bass_data | ---
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': '808'
'1': acid
'2': brass
'3': growl
'4': jump_up
'5': reese
'6': slap
'7': sub
- name: input_values
sequence: float32
- name: attention_mask
sequence: int32
splits:
- name: train
num_bytes: 3138542504
num_examples: 34408
- name: test
num_bytes: 346665208
num_examples: 3824
download_size: 1752534006
dataset_size: 3485207712
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-staging-eval-project-6fbfec76-7855039 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: santiviquez/bart-base-finetuned-samsum-en
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: santiviquez/bart-base-finetuned-samsum-en
* Dataset: samsum
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
Ti-Ma/wikipedia_2022 | ---
license: cc-by-sa-3.0
---
|
bigscience/xP3all | ---
annotations_creators:
- expert-generated
- crowdsourced
language:
- ak
- ar
- as
- bm
- bn
- ca
- code
- en
- es
- eu
- fon
- fr
- gu
- hi
- id
- ig
- ki
- kn
- lg
- ln
- ml
- mr
- ne
- nso
- ny
- or
- pa
- pt
- rn
- rw
- sn
- st
- sw
- ta
- te
- tn
- ts
- tum
- tw
- ur
- vi
- wo
- xh
- yo
- zh
- zu
programming_language:
- C
- C++
- C#
- Go
- Java
- JavaScript
- Lua
- PHP
- Python
- Ruby
- Rust
- Scala
- TypeScript
license:
- apache-2.0
multilinguality:
- multilingual
pretty_name: xP3
size_categories:
- 100M<n<1B
task_categories:
- other
---
# Dataset Card for xP3
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** https://github.com/bigscience-workshop/xmtf
- **Paper:** [Crosslingual Generalization through Multitask Finetuning](https://arxiv.org/abs/2211.01786)
- **Point of Contact:** [Niklas Muennighoff](mailto:niklas@hf.co)
### Dataset Summary
> xP3 (Crosslingual Public Pool of Prompts) is a collection of prompts & datasets across 46 languages & 16 NLP tasks. It is used for the training of BLOOMZ and mT0, multilingual language models capable of following human instructions in dozens of languages zero-shot.
- **Creation:** The dataset can be recreated using instructions available [here](https://github.com/bigscience-workshop/xmtf#create-xp3). We provide this version to save processing time and ease reproducibility.
- **Languages:** 46 (Can be extended by [recreating with more splits](https://github.com/bigscience-workshop/xmtf#create-xp3))
- **xP3 Dataset Family:**
<table>
<tr>
<th>Name</th>
<th>Explanation</th>
<th>Example models</th>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/Muennighoff/xP3x>xP3x</a></td>
<td>Mixture of 17 tasks in 277 languages with English prompts</td>
<td>WIP - Join us at Project Aya @<a href=https://cohere.for.ai/>C4AI</a> to help!</td>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/bigscience/xP3>xP3</a></td>
<td>Mixture of 13 training tasks in 46 languages with English prompts</td>
<td><a href=https://huggingface.co/bigscience/bloomz>bloomz</a> & <a href=https://huggingface.co/bigscience/mt0-xxl>mt0-xxl</a></td>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/bigscience/xP3mt>xP3mt</a></td>
<td>Mixture of 13 training tasks in 46 languages with prompts in 20 languages (machine-translated from English)</td>
<td><a href=https://huggingface.co/bigscience/bloomz-mt>bloomz-mt</a> & <a href=https://huggingface.co/bigscience/mt0-xxl-mt>mt0-xxl-mt</a></td>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/bigscience/xP3all>xP3all</a></td>
<td>xP3 + evaluation datasets adding an additional 3 tasks for a total of 16 tasks in 46 languages with English prompts</td>
<td></td>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/bigscience/xP3megds>xP3megds</a></td>
<td><a href=https://github.com/bigscience-workshop/Megatron-DeepSpeed>Megatron-DeepSpeed</a> processed version of xP3</td>
<td><a href=https://huggingface.co/bigscience/bloomz>bloomz</a></td>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/Muennighoff/P3>P3</a></td>
<td>Repreprocessed version of the English-only <a href=https://huggingface.co/datasets/bigscience/P3>P3</a> with 8 training tasks</td>
<td><a href=https://huggingface.co/bigscience/bloomz-p3>bloomz-p3</a> & <a href=https://huggingface.co/bigscience/mt0-xxl-p3>mt0-xxl-p3</a></td>
</tr>
</table>
## Dataset Structure
### Data Instances
An example of "train" looks as follows:
```json
{
"inputs": "Sentence 1: Fue académico en literatura metafísica, teología y ciencias clásicas.\nSentence 2: Fue académico en literatura metafísica, teología y ciencia clásica.\nQuestion: Can we rewrite Sentence 1 to Sentence 2? Yes or No?",
"targets": "Yes"
}
```
### Data Fields
The data fields are the same among all splits:
- `inputs`: the natural language input fed to the model
- `targets`: the natural language target that the model has to generate
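The two fields map directly onto a sequence-to-sequence training pair. A minimal sketch of consuming one record (the record below is the example shown above; in practice you would iterate over the dataset, e.g. via `datasets.load_dataset("bigscience/xP3all", ...)`):

```python
# Consume an xP3 record as a (source, target) pair for an
# encoder-decoder model. The record is the example from this card.
record = {
    "inputs": (
        "Sentence 1: Fue académico en literatura metafísica, teología y "
        "ciencias clásicas.\n"
        "Sentence 2: Fue académico en literatura metafísica, teología y "
        "ciencia clásica.\n"
        "Question: Can we rewrite Sentence 1 to Sentence 2? Yes or No?"
    ),
    "targets": "Yes",
}

def to_seq2seq_pair(rec):
    """Return (source, target) strings from an xP3 record."""
    return rec["inputs"], rec["targets"]

source, target = to_seq2seq_pair(record)
print(target)  # -> Yes
```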
### Data Splits
The table below summarizes sizes per language (computed from the `merged_{lang}.jsonl` files). Because languages like `tw` consist only of single-sentence translation samples from Flores, their byte percentage is significantly lower than their sample percentage.
|Language|Kilobytes|%|Samples|%|
|--------|------:|-:|---:|-:|
|tw|106288|0.11|265071|0.33|
|bm|107056|0.11|265180|0.33|
|ak|108096|0.11|265071|0.33|
|ca|110608|0.11|271191|0.33|
|eu|113008|0.11|281199|0.35|
|fon|113072|0.11|265063|0.33|
|st|114080|0.11|265063|0.33|
|ki|115040|0.12|265180|0.33|
|tum|116032|0.12|265063|0.33|
|wo|122560|0.12|365063|0.45|
|ln|126304|0.13|365060|0.45|
|as|156256|0.16|265063|0.33|
|or|161472|0.16|265063|0.33|
|kn|165456|0.17|265063|0.33|
|ml|175040|0.18|265864|0.33|
|rn|192992|0.19|318189|0.39|
|nso|229712|0.23|915051|1.13|
|tn|235536|0.24|915054|1.13|
|lg|235936|0.24|915021|1.13|
|rw|249360|0.25|915043|1.13|
|ts|250256|0.25|915044|1.13|
|sn|252496|0.25|865056|1.07|
|xh|254672|0.26|915058|1.13|
|zu|263712|0.26|915061|1.13|
|ny|272128|0.27|915063|1.13|
|ig|325232|0.33|950097|1.17|
|yo|352784|0.35|918416|1.13|
|ne|393680|0.39|315754|0.39|
|pa|523248|0.52|339210|0.42|
|gu|560688|0.56|347499|0.43|
|sw|566656|0.57|1130481|1.4|
|mr|666240|0.67|417269|0.52|
|bn|832720|0.83|428843|0.53|
|ta|926912|0.93|415433|0.51|
|te|1343232|1.35|584590|0.72|
|ur|1918272|1.92|855756|1.06|
|vi|3102512|3.11|1672106|2.07|
|code|4330752|4.34|2707724|3.34|
|hi|4403568|4.41|1554667|1.92|
|zh|4599440|4.61|3589234|4.43|
|id|4612256|4.62|2643418|3.27|
|ar|4683456|4.69|2160181|2.67|
|fr|6591120|6.6|5316403|6.57|
|pt|6886800|6.9|3752156|4.63|
|es|8587920|8.6|5413205|6.69|
|en|39252528|39.33|32740750|40.44|
|total|99807184|100.0|80956089|100.0|
## Dataset Creation
### Source Data
#### Training datasets
- Code Miscellaneous
- [CodeComplex](https://huggingface.co/datasets/codeparrot/codecomplex)
- [Docstring Corpus](https://huggingface.co/datasets/teven/code_docstring_corpus)
- [GreatCode](https://huggingface.co/datasets/great_code)
- [State Changes](https://huggingface.co/datasets/Fraser/python-state-changes)
- Closed-book QA
- [Hotpot QA](https://huggingface.co/datasets/hotpot_qa)
- [Trivia QA](https://huggingface.co/datasets/trivia_qa)
- [Web Questions](https://huggingface.co/datasets/web_questions)
- [Wiki QA](https://huggingface.co/datasets/wiki_qa)
- Extractive QA
- [Adversarial QA](https://huggingface.co/datasets/adversarial_qa)
- [CMRC2018](https://huggingface.co/datasets/cmrc2018)
- [DRCD](https://huggingface.co/datasets/clue)
- [DuoRC](https://huggingface.co/datasets/duorc)
- [MLQA](https://huggingface.co/datasets/mlqa)
- [Quoref](https://huggingface.co/datasets/quoref)
- [ReCoRD](https://huggingface.co/datasets/super_glue)
- [ROPES](https://huggingface.co/datasets/ropes)
- [SQuAD v2](https://huggingface.co/datasets/squad_v2)
- [xQuAD](https://huggingface.co/datasets/xquad)
- TyDI QA
- [Primary](https://huggingface.co/datasets/khalidalt/tydiqa-primary)
- [Goldp](https://huggingface.co/datasets/khalidalt/tydiqa-goldp)
- Multiple-Choice QA
- [ARC](https://huggingface.co/datasets/ai2_arc)
- [C3](https://huggingface.co/datasets/c3)
- [CoS-E](https://huggingface.co/datasets/cos_e)
- [Cosmos](https://huggingface.co/datasets/cosmos)
- [DREAM](https://huggingface.co/datasets/dream)
- [MultiRC](https://huggingface.co/datasets/super_glue)
- [OpenBookQA](https://huggingface.co/datasets/openbookqa)
- [PiQA](https://huggingface.co/datasets/piqa)
- [QUAIL](https://huggingface.co/datasets/quail)
- [QuaRel](https://huggingface.co/datasets/quarel)
- [QuaRTz](https://huggingface.co/datasets/quartz)
- [QASC](https://huggingface.co/datasets/qasc)
- [RACE](https://huggingface.co/datasets/race)
- [SciQ](https://huggingface.co/datasets/sciq)
- [Social IQA](https://huggingface.co/datasets/social_i_qa)
- [Wiki Hop](https://huggingface.co/datasets/wiki_hop)
- [WiQA](https://huggingface.co/datasets/wiqa)
- Paraphrase Identification
- [MRPC](https://huggingface.co/datasets/super_glue)
- [PAWS](https://huggingface.co/datasets/paws)
- [PAWS-X](https://huggingface.co/datasets/paws-x)
- [QQP](https://huggingface.co/datasets/qqp)
- Program Synthesis
- [APPS](https://huggingface.co/datasets/codeparrot/apps)
- [CodeContests](https://huggingface.co/datasets/teven/code_contests)
- [JupyterCodePairs](https://huggingface.co/datasets/codeparrot/github-jupyter-text-code-pairs)
- [MBPP](https://huggingface.co/datasets/Muennighoff/mbpp)
- [NeuralCodeSearch](https://huggingface.co/datasets/neural_code_search)
- [XLCoST](https://huggingface.co/datasets/codeparrot/xlcost-text-to-code)
- Structure-to-text
- [Common Gen](https://huggingface.co/datasets/common_gen)
- [Wiki Bio](https://huggingface.co/datasets/wiki_bio)
- Sentiment
- [Amazon](https://huggingface.co/datasets/amazon_polarity)
- [App Reviews](https://huggingface.co/datasets/app_reviews)
- [IMDB](https://huggingface.co/datasets/imdb)
- [Rotten Tomatoes](https://huggingface.co/datasets/rotten_tomatoes)
- [Yelp](https://huggingface.co/datasets/yelp_review_full)
- Simplification
- [BiSECT](https://huggingface.co/datasets/GEM/BiSECT)
- Summarization
- [CNN Daily Mail](https://huggingface.co/datasets/cnn_dailymail)
- [Gigaword](https://huggingface.co/datasets/gigaword)
- [MultiNews](https://huggingface.co/datasets/multi_news)
- [SamSum](https://huggingface.co/datasets/samsum)
- [Wiki-Lingua](https://huggingface.co/datasets/GEM/wiki_lingua)
- [XLSum](https://huggingface.co/datasets/GEM/xlsum)
- [XSum](https://huggingface.co/datasets/xsum)
- Topic Classification
- [AG News](https://huggingface.co/datasets/ag_news)
- [DBPedia](https://huggingface.co/datasets/dbpedia_14)
- [TNEWS](https://huggingface.co/datasets/clue)
- [TREC](https://huggingface.co/datasets/trec)
- [CSL](https://huggingface.co/datasets/clue)
- Translation
- [Flores-200](https://huggingface.co/datasets/Muennighoff/flores200)
- [Tatoeba](https://huggingface.co/datasets/Helsinki-NLP/tatoeba_mt)
- Word Sense Disambiguation
- [WiC](https://huggingface.co/datasets/super_glue)
- [XL-WiC](https://huggingface.co/datasets/pasinit/xlwic)
#### Evaluation datasets (included in [xP3all](https://huggingface.co/datasets/bigscience/xP3all) except for HumanEval)
- Natural Language Inference
- [ANLI](https://huggingface.co/datasets/anli)
- [CB](https://huggingface.co/datasets/super_glue)
- [RTE](https://huggingface.co/datasets/super_glue)
- [XNLI](https://huggingface.co/datasets/xnli)
- Coreference Resolution
- [Winogrande](https://huggingface.co/datasets/winogrande)
- [XWinograd](https://huggingface.co/datasets/Muennighoff/xwinograd)
- Program Synthesis
- [HumanEval](https://huggingface.co/datasets/openai_humaneval)
- Sentence Completion
- [COPA](https://huggingface.co/datasets/super_glue)
- [Story Cloze](https://huggingface.co/datasets/story_cloze)
- [XCOPA](https://huggingface.co/datasets/xcopa)
- [XStoryCloze](https://huggingface.co/datasets/Muennighoff/xstory_cloze)
#### Additional [xP3all](https://huggingface.co/datasets/bigscience/xP3all) datasets
- Coreference Resolution
- [WSC (Fixed)](https://huggingface.co/datasets/super_glue)
- Sentence Completion
- [HellaSwag](https://huggingface.co/datasets/hellaswag)
- Translation
- [MultiEurlex](https://huggingface.co/datasets/multi_eurlex)
## Additional Information
### Licensing Information
The dataset is released under Apache 2.0.
### Citation Information
```bibtex
@misc{muennighoff2022crosslingual,
title={Crosslingual Generalization through Multitask Finetuning},
author={Niklas Muennighoff and Thomas Wang and Lintang Sutawika and Adam Roberts and Stella Biderman and Teven Le Scao and M Saiful Bari and Sheng Shen and Zheng-Xin Yong and Hailey Schoelkopf and Xiangru Tang and Dragomir Radev and Alham Fikri Aji and Khalid Almubarak and Samuel Albanie and Zaid Alyafeai and Albert Webson and Edward Raff and Colin Raffel},
year={2022},
eprint={2211.01786},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to the contributors of [promptsource](https://github.com/bigscience-workshop/promptsource/graphs/contributors) for adding many prompts used in this dataset. |
AdapterOcean/pythonbook-standardized_embedded | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 18787420
num_examples: 2574
download_size: 0
dataset_size: 18787420
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "pythonbook-standardized_embedded"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Babelscape/REDFM | ---
dataset_info:
- config_name: ar
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
struct:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: predicate
dtype:
class_label:
names:
'0': country
'1': place of birth
'2': spouse
'3': country of citizenship
'4': instance of
'5': capital
'6': child
'7': shares border with
'8': author
'9': director
'10': occupation
'11': founded by
'12': league
'13': owned by
'14': genre
'15': named after
'16': follows
'17': headquarters location
'18': cast member
'19': manufacturer
'20': located in or next to body of water
'21': location
'22': part of
'23': mouth of the watercourse
'24': member of
'25': sport
'26': characters
'27': participant
'28': notable work
'29': replaces
'30': sibling
'31': inception
- name: object
struct:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
splits:
- name: test
num_bytes: 521806
num_examples: 345
- name: validation
num_bytes: 577499
num_examples: 385
download_size: 3458539
dataset_size: 1099305
- config_name: de
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
struct:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: predicate
dtype:
class_label:
names:
'0': country
'1': place of birth
'2': spouse
'3': country of citizenship
'4': instance of
'5': capital
'6': child
'7': shares border with
'8': author
'9': director
'10': occupation
'11': founded by
'12': league
'13': owned by
'14': genre
'15': named after
'16': follows
'17': headquarters location
'18': cast member
'19': manufacturer
'20': located in or next to body of water
'21': location
'22': part of
'23': mouth of the watercourse
'24': member of
'25': sport
'26': characters
'27': participant
'28': notable work
'29': replaces
'30': sibling
'31': inception
- name: object
struct:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
splits:
- name: train
num_bytes: 2455615
num_examples: 2071
- name: test
num_bytes: 334212
num_examples: 285
- name: validation
num_bytes: 310862
num_examples: 252
download_size: 8072481
dataset_size: 3100689
- config_name: en
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
struct:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: predicate
dtype:
class_label:
names:
'0': country
'1': place of birth
'2': spouse
'3': country of citizenship
'4': instance of
'5': capital
'6': child
'7': shares border with
'8': author
'9': director
'10': occupation
'11': founded by
'12': league
'13': owned by
'14': genre
'15': named after
'16': follows
'17': headquarters location
'18': cast member
'19': manufacturer
'20': located in or next to body of water
'21': location
'22': part of
'23': mouth of the watercourse
'24': member of
'25': sport
'26': characters
'27': participant
'28': notable work
'29': replaces
'30': sibling
'31': inception
- name: object
struct:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
splits:
- name: train
num_bytes: 4387657
num_examples: 2878
- name: test
num_bytes: 654376
num_examples: 446
- name: validation
num_bytes: 617141
num_examples: 449
download_size: 13616716
dataset_size: 5659174
- config_name: es
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
struct:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: predicate
dtype:
class_label:
names:
'0': country
'1': place of birth
'2': spouse
'3': country of citizenship
'4': instance of
'5': capital
'6': child
'7': shares border with
'8': author
'9': director
'10': occupation
'11': founded by
'12': league
'13': owned by
'14': genre
'15': named after
'16': follows
'17': headquarters location
'18': cast member
'19': manufacturer
'20': located in or next to body of water
'21': location
'22': part of
'23': mouth of the watercourse
'24': member of
'25': sport
'26': characters
'27': participant
'28': notable work
'29': replaces
'30': sibling
'31': inception
- name: object
struct:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
splits:
- name: train
num_bytes: 2452744
num_examples: 1866
- name: test
num_bytes: 345782
num_examples: 281
- name: validation
num_bytes: 299692
num_examples: 228
download_size: 7825400
dataset_size: 3098218
- config_name: fr
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
struct:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: predicate
dtype:
class_label:
names:
'0': country
'1': place of birth
'2': spouse
'3': country of citizenship
'4': instance of
'5': capital
'6': child
'7': shares border with
'8': author
'9': director
'10': occupation
'11': founded by
'12': league
'13': owned by
'14': genre
'15': named after
'16': follows
'17': headquarters location
'18': cast member
'19': manufacturer
'20': located in or next to body of water
'21': location
'22': part of
'23': mouth of the watercourse
'24': member of
'25': sport
'26': characters
'27': participant
'28': notable work
'29': replaces
'30': sibling
'31': inception
- name: object
struct:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
splits:
- name: train
num_bytes: 2280992
num_examples: 1865
- name: test
num_bytes: 427990
num_examples: 415
- name: validation
num_bytes: 429165
num_examples: 416
download_size: 8257363
dataset_size: 3138147
- config_name: it
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
struct:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: predicate
dtype:
class_label:
names:
'0': country
'1': place of birth
'2': spouse
'3': country of citizenship
'4': instance of
'5': capital
'6': child
'7': shares border with
'8': author
'9': director
'10': occupation
'11': founded by
'12': league
'13': owned by
'14': genre
'15': named after
'16': follows
'17': headquarters location
'18': cast member
'19': manufacturer
'20': located in or next to body of water
'21': location
'22': part of
'23': mouth of the watercourse
'24': member of
'25': sport
'26': characters
'27': participant
'28': notable work
'29': replaces
'30': sibling
'31': inception
- name: object
struct:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
splits:
- name: train
num_bytes: 1918310
num_examples: 1657
- name: test
num_bytes: 489445
num_examples: 509
- name: validation
num_bytes: 485557
num_examples: 521
download_size: 7537265
dataset_size: 2893312
- config_name: zh
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
struct:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: predicate
dtype:
class_label:
names:
'0': country
'1': place of birth
'2': spouse
'3': country of citizenship
'4': instance of
'5': capital
'6': child
'7': shares border with
'8': author
'9': director
'10': occupation
'11': founded by
'12': league
'13': owned by
'14': genre
'15': named after
'16': follows
'17': headquarters location
'18': cast member
'19': manufacturer
'20': located in or next to body of water
'21': location
'22': part of
'23': mouth of the watercourse
'24': member of
'25': sport
'26': characters
'27': participant
'28': notable work
'29': replaces
'30': sibling
'31': inception
- name: object
struct:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
splits:
- name: test
num_bytes: 311905
num_examples: 270
- name: validation
num_bytes: 364077
num_examples: 307
download_size: 1952982
dataset_size: 675982
- config_name: all_languages
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: lan
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
struct:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: predicate
dtype:
class_label:
names:
'0': country
'1': place of birth
'2': spouse
'3': country of citizenship
'4': instance of
'5': capital
'6': child
'7': shares border with
'8': author
'9': director
'10': occupation
'11': founded by
'12': league
'13': owned by
'14': genre
'15': named after
'16': follows
'17': headquarters location
'18': cast member
'19': manufacturer
'20': located in or next to body of water
'21': location
'22': part of
'23': mouth of the watercourse
'24': member of
'25': sport
'26': characters
'27': participant
'28': notable work
'29': replaces
'30': sibling
'31': inception
- name: object
struct:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
splits:
- name: train
num_bytes: 13557340
num_examples: 10337
- name: test
num_bytes: 3100822
num_examples: 2551
- name: validation
num_bytes: 3099341
num_examples: 2558
download_size: 50720746
dataset_size: 19757503
task_categories:
- token-classification
language:
- ar
- de
- en
- es
- it
- fr
- zh
size_categories:
- 10K<n<100K
license: cc-by-sa-4.0
---
# RED<sup>FM</sup>: a Filtered and Multilingual Relation Extraction Dataset
This is the human-filtered dataset from the 2023 ACL paper [RED^{FM}: a Filtered and Multilingual Relation Extraction Dataset](https://arxiv.org/abs/2306.09802). If you use the dataset, please cite this work in your paper:
```bib
@inproceedings{huguet-cabot-et-al-2023-redfm-dataset,
    title = "RED$^{\rm FM}$: a Filtered and Multilingual Relation Extraction Dataset",
    author = "Huguet Cabot, Pere-Llu{\'\i}s and Tedeschi, Simone and Ngonga Ngomo, Axel-Cyrille and Navigli, Roberto",
    booktitle = "Proc. of the 61st Annual Meeting of the Association for Computational Linguistics: ACL 2023",
    month = jul,
    year = "2023",
    address = "Toronto, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/2306.09802",
}
```
## License
RED<sup>FM</sup> is licensed under the CC BY-SA 4.0 license. The text of the license can be found [here](https://creativecommons.org/licenses/by-sa/4.0/). |
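## Decoding predicate labels
The `predicate` field of each relation is stored as a `class_label` integer over the 32 relation types listed in this card's metadata. A minimal sketch of mapping such an id back to its relation name; `decode_predicate` is a hypothetical helper for illustration, not part of the dataset tooling:

```python
# Predicate id -> relation name, copied from the class_label names
# in this card's metadata (ids 0..31).
PREDICATE_NAMES = [
    "country", "place of birth", "spouse", "country of citizenship",
    "instance of", "capital", "child", "shares border with", "author",
    "director", "occupation", "founded by", "league", "owned by", "genre",
    "named after", "follows", "headquarters location", "cast member",
    "manufacturer", "located in or next to body of water", "location",
    "part of", "mouth of the watercourse", "member of", "sport",
    "characters", "participant", "notable work", "replaces", "sibling",
    "inception",
]

def decode_predicate(label_id: int) -> str:
    """Map a class_label integer to its human-readable relation name."""
    return PREDICATE_NAMES[label_id]

print(decode_predicate(18))  # cast member
```

The same mapping is exposed automatically when loading the dataset with the `datasets` library, via the feature's `int2str` method on the `predicate` `ClassLabel`.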
jlbaker361/test_chosen_runner | ---
dataset_info:
features:
- name: label
dtype: string
- name: textual_inversion_prompt_similarity
dtype: float32
- name: textual_inversion_identity_consistency
dtype: float32
- name: textual_inversion_negative_prompt_similarity
dtype: float32
- name: textual_inversion_target_prompt_similarity
dtype: float32
- name: unet_lora_prompt_similarity
dtype: float32
- name: unet_lora_identity_consistency
dtype: float32
- name: unet_lora_negative_prompt_similarity
dtype: float32
- name: unet_lora_target_prompt_similarity
dtype: float32
splits:
- name: train
num_bytes: 1248
num_examples: 31
download_size: 7162
dataset_size: 1248
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jbfreb/fashion_image_caption_100_v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 22820471.0
num_examples: 100
download_size: 22820374
dataset_size: 22820471.0
---
# Dataset Card for "fashion_image_caption_100_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Freakscode/Animals | ---
license: other
---
|
bilalahmadai/open_assistant_dataset_llama2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 303916
num_examples: 700
- name: validation
num_bytes: 176400
num_examples: 300
download_size: 179286
dataset_size: 480316
---
# Dataset Card for "open_assistant_dataset_llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EricAntoie/VF_Def | ---
license: gpl
---
|
deetsadi/processed_dwi_cropped_rgb | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: conditioning_image
dtype: image
splits:
- name: train
num_bytes: 11845241.0
num_examples: 200
download_size: 11613007
dataset_size: 11845241.0
---
# Dataset Card for "processed_dwi_cropped_rgb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_269 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 21010211856.625
num_examples: 218747
download_size: 19104304478
dataset_size: 21010211856.625
---
# Dataset Card for "chunk_269"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
uyentk/thucuc_data | ---
dataset_info:
- config_name: QA_data
features:
- name: quest_content
dtype: string
- name: text_ans
dtype: string
- name: url
dtype: string
- name: quest
dtype: string
splits:
- name: train
num_bytes: 3918114
num_examples: 1944
download_size: 1881638
dataset_size: 3918114
- config_name: default
features:
- name: text
dtype: string
- name: metadata
struct:
- name: desc
dtype: string
- name: title
dtype: string
- name: url
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 54612520
num_examples: 6735
download_size: 15710416
dataset_size: 54612520
- config_name: full_qa
features:
- name: metadata
struct:
- name: url
dtype: string
- name: quest
dtype: string
- name: quest_content
dtype: string
- name: text_ans
dtype: string
splits:
- name: train
num_bytes: 3677460
num_examples: 1944
download_size: 1759570
dataset_size: 3677460
- config_name: news6
features:
- name: text_ans
dtype: string
- name: metadata
struct:
- name: quest
dtype: string
- name: url
dtype: string
- name: quest_content
dtype: string
splits:
- name: train
num_bytes: 71898
num_examples: 44
download_size: 49839
dataset_size: 71898
- config_name: news_data
features:
- name: type
dtype: string
- name: text
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: desc
dtype: string
splits:
- name: train
num_bytes: 792249025
num_examples: 114323
download_size: 257336967
dataset_size: 792249025
configs:
- config_name: QA_data
data_files:
- split: train
path: QA_data/train-*
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: full_qa
data_files:
- split: train
path: full_qa/train-*
- config_name: news6
data_files:
- split: train
path: news6/train-*
- config_name: news_data
data_files:
- split: train
path: news_data/train-*
---
|
nikniksen/TMJIT_v2 | ---
dataset_info:
features:
- name: example
dtype: string
splits:
- name: train
num_bytes: 11341
num_examples: 8
- name: test
num_bytes: 12887
num_examples: 9
download_size: 42040
dataset_size: 24228
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
chujiezheng/glove_embedding | ---
license: apache-2.0
language:
- en
---
Embedding similarity calculation files for the ACL 2021 paper "Towards Emotional Support Dialog Systems".
[GitHub repo](https://github.com/thu-coai/Emotional-Support-Conversation). [Original paper](https://arxiv.org/abs/2106.01144).
```bib
@inproceedings{liu-etal-2021-towards,
title={Towards Emotional Support Dialog Systems},
author={Liu, Siyang and
Zheng, Chujie and
Demasi, Orianna and
Sabour, Sahand and
Li, Yu and
Yu, Zhou and
Jiang, Yong and
Huang, Minlie},
booktitle={ACL},
year={2021}
}
```
|
ctu-aic/qacg-pl | ---
dataset_info:
- config_name: balanced
features:
- name: claim
dtype: string
- name: label
dtype: string
- name: evidence
sequence: string
splits:
- name: train
num_bytes: 28840978
num_examples: 295209
- name: validation
num_bytes: 2999469
num_examples: 30087
- name: test
num_bytes: 2794136
num_examples: 28440
download_size: 23940163
dataset_size: 34634583
- config_name: balanced_shuf
features:
- name: claim
dtype: string
- name: label
dtype: string
- name: evidence
sequence: string
splits:
- name: train
num_bytes: 17796423
num_examples: 183204
- name: validation
num_bytes: 1843397
num_examples: 18685
- name: test
num_bytes: 1723848
num_examples: 17731
download_size: 14541050
dataset_size: 21363668
configs:
- config_name: balanced
data_files:
- split: train
path: balanced/train-*
- split: validation
path: balanced/validation-*
- split: test
path: balanced/test-*
- config_name: balanced_shuf
data_files:
- split: train
path: balanced_shuf/train-*
- split: validation
path: balanced_shuf/validation-*
- split: test
path: balanced_shuf/test-*
---
|
open-llm-leaderboard/details_SC99__Mistral-7B-summ-ia3-tuned-8h | ---
pretty_name: Evaluation run of SC99/Mistral-7B-summ-ia3-tuned-8h
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SC99/Mistral-7B-summ-ia3-tuned-8h](https://huggingface.co/SC99/Mistral-7B-summ-ia3-tuned-8h)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SC99__Mistral-7B-summ-ia3-tuned-8h\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-29T13:30:16.956785](https://huggingface.co/datasets/open-llm-leaderboard/details_SC99__Mistral-7B-summ-ia3-tuned-8h/blob/main/results_2024-01-29T13-30-16.956785.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.598517503564931,\n\
\ \"acc_stderr\": 0.03329970966372362,\n \"acc_norm\": 0.60343791220307,\n\
\ \"acc_norm_stderr\": 0.03397979412812745,\n \"mc1\": 0.5520195838433293,\n\
\ \"mc1_stderr\": 0.017408513063422913,\n \"mc2\": 0.6830892289108447,\n\
\ \"mc2_stderr\": 0.015395499999839348\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5571672354948806,\n \"acc_stderr\": 0.014515573873348897,\n\
\ \"acc_norm\": 0.6117747440273038,\n \"acc_norm_stderr\": 0.01424161420741405\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6754630551682932,\n\
\ \"acc_stderr\": 0.004672447046820004,\n \"acc_norm\": 0.8514240191196972,\n\
\ \"acc_norm_stderr\": 0.003549431247907358\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137602,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137602\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n\
\ \"acc_stderr\": 0.02732754844795755,\n \"acc_norm\": 0.6387096774193548,\n\
\ \"acc_norm_stderr\": 0.02732754844795755\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630643,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630643\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.02510682066053975,\n \
\ \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.02510682066053975\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815642,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815642\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7926605504587156,\n \"acc_stderr\": 0.01738141556360868,\n \"\
acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.01738141556360868\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690879,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690879\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.014866821664709588,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.014866821664709588\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2927374301675978,\n\
\ \"acc_stderr\": 0.015218109544410172,\n \"acc_norm\": 0.2927374301675978,\n\
\ \"acc_norm_stderr\": 0.015218109544410172\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615697,\n\
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615697\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630446,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630446\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4230769230769231,\n\
\ \"acc_stderr\": 0.01261820406658839,\n \"acc_norm\": 0.4230769230769231,\n\
\ \"acc_norm_stderr\": 0.01261820406658839\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6241830065359477,\n \"acc_stderr\": 0.01959402113657744,\n \
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.01959402113657744\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919797,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919797\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5520195838433293,\n\
\ \"mc1_stderr\": 0.017408513063422913,\n \"mc2\": 0.6830892289108447,\n\
\ \"mc2_stderr\": 0.015395499999839348\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025386\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3661865049279757,\n \
\ \"acc_stderr\": 0.013270100238748835\n }\n}\n```"
repo_url: https://huggingface.co/SC99/Mistral-7B-summ-ia3-tuned-8h
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|arc:challenge|25_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|gsm8k|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hellaswag|10_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T13-30-16.956785.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T13-30-16.956785.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- '**/details_harness|winogrande|5_2024-01-29T13-30-16.956785.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-29T13-30-16.956785.parquet'
- config_name: results
data_files:
- split: 2024_01_29T13_30_16.956785
path:
- results_2024-01-29T13-30-16.956785.parquet
- split: latest
path:
- results_2024-01-29T13-30-16.956785.parquet
---
# Dataset Card for Evaluation run of SC99/Mistral-7B-summ-ia3-tuned-8h
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SC99/Mistral-7B-summ-ia3-tuned-8h](https://huggingface.co/SC99/Mistral-7B-summ-ia3-tuned-8h) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SC99__Mistral-7B-summ-ia3-tuned-8h",
"harness_winogrande_5",
	split="latest")
```
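The per-task metrics in such a run can be aggregated locally once loaded. As a minimal sketch, the snippet below computes a macro-average (unweighted mean) accuracy over a few of the `harness|hendrycksTest-*` entries; the dictionary literal is a small hand-copied excerpt of the full results shown further down, used here only for illustration:

```python
# Excerpt of the per-task results; keys follow the
# "harness|<task>|<n_shot>" naming used throughout this card.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.562962962962963},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6578947368421053},
}

# Macro-average: unweighted mean of per-task accuracies.
macro_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(macro_acc, 4))
```

Note that the leaderboard's reported "all" score is computed over every task, so this three-task average will not match it.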
## Latest results
These are the [latest results from run 2024-01-29T13:30:16.956785](https://huggingface.co/datasets/open-llm-leaderboard/details_SC99__Mistral-7B-summ-ia3-tuned-8h/blob/main/results_2024-01-29T13-30-16.956785.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.598517503564931,
"acc_stderr": 0.03329970966372362,
"acc_norm": 0.60343791220307,
"acc_norm_stderr": 0.03397979412812745,
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422913,
"mc2": 0.6830892289108447,
"mc2_stderr": 0.015395499999839348
},
"harness|arc:challenge|25": {
"acc": 0.5571672354948806,
"acc_stderr": 0.014515573873348897,
"acc_norm": 0.6117747440273038,
"acc_norm_stderr": 0.01424161420741405
},
"harness|hellaswag|10": {
"acc": 0.6754630551682932,
"acc_stderr": 0.004672447046820004,
"acc_norm": 0.8514240191196972,
"acc_norm_stderr": 0.003549431247907358
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137602,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137602
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.02732754844795755,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.02732754844795755
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630643,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630643
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.02510682066053975,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.02510682066053975
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815642,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815642
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.01738141556360868,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.01738141556360868
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690879,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690879
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335445,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335445
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.014866821664709588,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.014866821664709588
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2927374301675978,
"acc_stderr": 0.015218109544410172,
"acc_norm": 0.2927374301675978,
"acc_norm_stderr": 0.015218109544410172
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.026925654653615697,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.026925654653615697
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.025773111169630446,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.025773111169630446
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.01261820406658839,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.01261820406658839
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.01959402113657744,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.01959402113657744
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919797,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919797
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653693,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653693
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422913,
"mc2": 0.6830892289108447,
"mc2_stderr": 0.015395499999839348
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.011807360224025386
},
"harness|gsm8k|5": {
"acc": 0.3661865049279757,
"acc_stderr": 0.013270100238748835
}
}
```
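The per-task `acc` values above can be aggregated with a short script. A minimal sketch, using a few values copied from the JSON excerpt (the leaderboard itself averages all subtasks of a benchmark equally):

```python
import statistics

# A few per-task accuracies copied from the results above
# (harness|hendrycksTest-*); the full benchmark averages all subtasks.
acc = {
    "formal_logic": 0.4126984126984127,
    "global_facts": 0.36,
    "high_school_biology": 0.6387096774193548,
}

avg = statistics.mean(acc.values())
print(f"mean acc over {len(acc)} tasks: {avg:.4f}")
# → mean acc over 3 tasks: 0.4705
```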
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
diwank/time-sensitive-qa | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: idx
dtype: string
- name: question
dtype: string
- name: context
dtype: string
- name: targets
sequence: string
- name: paragraphs
list:
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 295226558
num_examples: 14681
- name: test
num_bytes: 64202578
num_examples: 3078
- name: validation
num_bytes: 63453245
num_examples: 3087
download_size: 74250897
dataset_size: 422882381
---
# Dataset Card for "time-sensitive-qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ML4CO/TSPLIBOriDataset | ---
license: apache-2.0
---
|
CyberHarem/quercus_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of quercus/クエルクス/夏栎 (Arknights)
This is the dataset of quercus/クエルクス/夏栎 (Arknights), containing 125 images and their tags.
The core tags of this character are `animal_ears, long_hair, breasts, blonde_hair, yellow_eyes, large_breasts, tail, animal_ear_fluff, cat_ears, brown_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 125 | 204.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/quercus_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 125 | 176.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/quercus_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 305 | 352.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/quercus_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/quercus_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 29 |  |  |  |  |  | 1girl, solo, looking_at_viewer, smile, cleavage_cutout, simple_background, white_background, black_gloves, shirt, bare_shoulders, leotard, upper_body, braid, cat_girl, covered_navel, cat_tail, closed_mouth, fur_trim, holding, sleeveless |
| 1 | 14 |  |  |  |  |  | blush, nipples, sweat, 1girl, hetero, 1boy, pussy, solo_focus, completely_nude, looking_at_viewer, vaginal, cum, open_mouth, penis, navel, spread_legs, anus, ass, bar_censor, collarbone, mosaic_censoring, pubic_hair, sex_from_behind, smile |
| 2 | 12 |  |  |  |  |  | 1girl, blush, horse_ears, horse_girl, solo, horse_tail, looking_at_viewer, nipples, smile, black_thighhighs, blue_eyes, black_headwear, hat, white_cape, censored, open_mouth, pussy, sweat, thick_eyebrows, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | smile | cleavage_cutout | simple_background | white_background | black_gloves | shirt | bare_shoulders | leotard | upper_body | braid | cat_girl | covered_navel | cat_tail | closed_mouth | fur_trim | holding | sleeveless | blush | nipples | sweat | hetero | 1boy | pussy | solo_focus | completely_nude | vaginal | cum | open_mouth | penis | navel | spread_legs | anus | ass | bar_censor | collarbone | mosaic_censoring | pubic_hair | sex_from_behind | horse_ears | horse_girl | horse_tail | black_thighhighs | blue_eyes | black_headwear | hat | white_cape | censored | thick_eyebrows | thighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:------------------|:--------------------|:-------------------|:---------------|:--------|:-----------------|:----------|:-------------|:--------|:-----------|:----------------|:-----------|:---------------|:-----------|:----------|:-------------|:--------|:----------|:--------|:---------|:-------|:--------|:-------------|:------------------|:----------|:------|:-------------|:--------|:--------|:--------------|:-------|:------|:-------------|:-------------|:-------------------|:-------------|:------------------|:-------------|:-------------|:-------------|:-------------------|:------------|:-----------------|:------|:-------------|:-----------|:-----------------|:---------|
| 0 | 29 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | | X | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | X | X | X | | | | | | | | | | | | | | | | | X | X | X | | | X | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
stevied67/autotrain-data-pegasus-subreddit-comments-summarizer | ---
language:
- en
task_categories:
- summarization
---
# AutoTrain Dataset for project: pegasus-subreddit-comments-summarizer
## Dataset Description
This dataset has been automatically processed by AutoTrain for project pegasus-subreddit-comments-summarizer.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "I go through this every single year. We have an Ironman competition that is 2 miles from my hotel, and I sell out for that weekend almost a year in advance. Without fail I will have some nitwit who will come up on their checkout day and ask to extend, when I tell them I can't they lose their mind at me. It's their room, they paid for it, they're already in there how can I just give it away. People do not understand how reservations work.",
"target": "The commenter experiences this every year - they sell out their hotel almost a year in advance for an Ironman competition nearby. Despite this, some customers still ask to extend their stay at checkout and get angry when told it's not possible because they don't understand how reservations work."
},
{
"text": "Can i just say .. thanks for going back to make sure you hadn't overreacted. Im sure that made things so much easier on all the staff, with it being their first days back, being understaffed, I'm sure, and trying to get back into the swing of things. I think you handled that really well :)",
"target": "The commenter appreciates the poster's effort in going back to verify if they had overreacted. The commenter believes this action might have made things easier for the understaffed team during their first days back. The commenter commends the poster for handling the situation well."
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation sets. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 7177 |
| valid | 1796 |
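The split sizes above correspond to roughly an 80/20 train/validation partition; a quick check (numbers copied from the table):

```python
# Split sizes from the table above.
train, valid = 7177, 1796
total = train + valid

print(f"train: {train/total:.1%}, valid: {valid/total:.1%}")
# → train: 80.0%, valid: 20.0%
```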
|
imdatta0/instruct_v3_formatted | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: source
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 438820206.36603343
num_examples: 55917
- name: test
num_bytes: 1961926.633966564
num_examples: 250
download_size: 253236447
dataset_size: 440782133.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
liuyanchen1015/MULTI_VALUE_cola_who_what | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 4114
num_examples: 47
- name: test
num_bytes: 2873
num_examples: 34
- name: train
num_bytes: 41650
num_examples: 457
download_size: 28365
dataset_size: 48637
---
# Dataset Card for "MULTI_VALUE_cola_who_what"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
imperialwarrior/open-australian-legal-qa-paraphrased-easy-gpt-with-emb | ---
dataset_info:
features:
- name: pipeline_1_result
dtype: string
- name: pipeline_1_result_r_embeddings
sequence: float64
- name: pipeline_1_result_nr_embeddings
sequence: float64
- name: pipeline_2_context
dtype: string
- name: pipeline_2_result
dtype: string
- name: pipeline_2_result_r_embeddings
sequence: float64
- name: pipeline_2_result_nr_embeddings
sequence: float64
- name: pipeline_3_context
dtype: string
- name: pipeline_3_result
dtype: string
- name: pipeline_3_result_r_embeddings
sequence: float64
- name: pipeline_3_result_nr_embeddings
sequence: float64
- name: pipeline_4_context
dtype: string
- name: pipeline_4_result
dtype: string
- name: pipeline_4_result_r_embeddings
sequence: float64
- name: pipeline_4_result_nr_embeddings
sequence: float64
- name: pipeline_5_context
dtype: string
- name: pipeline_5_result
dtype: string
- name: pipeline_5_result_r_embeddings
sequence: float64
- name: pipeline_5_result_nr_embeddings
sequence: float64
- name: pipeline_6_context
dtype: string
- name: pipeline_6_result
dtype: string
- name: pipeline_6_result_r_embeddings
sequence: float64
- name: pipeline_6_result_nr_embeddings
sequence: float64
- name: pipeline_7_context
dtype: string
- name: pipeline_7_result
dtype: string
- name: pipeline_7_result_r_embeddings
sequence: float64
- name: pipeline_7_result_nr_embeddings
sequence: float64
- name: referenced_question
dtype: string
- name: answer
dtype: string
- name: answer_non_retrieval_embeddings
dtype: string
- name: answer_retrieval_embeddings
dtype: string
- name: question
dtype: string
- name: question_retrieval_embeddings
dtype: string
- name: question_non_retrieval_embeddings
dtype: string
- name: __index_level_0__
dtype: float64
- name: case_index
dtype: float64
- name: pipeline_6_case_indexes
sequence: int64
- name: pipeline_7_case_indexes
sequence: int64
splits:
- name: train
num_bytes: 137944644
num_examples: 208
download_size: 32779364
dataset_size: 137944644
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zolak/twitter_dataset_78_1713146783 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 573363
num_examples: 1368
download_size: 287620
dataset_size: 573363
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Moby/botw_dish | ---
license: unknown
---
|
TheFinAI/flare-mlesg | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: test
num_bytes: 926136
num_examples: 300
download_size: 228133
dataset_size: 926136
---
# Dataset Card for "flare-mlesg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UKPLab/SLTrans | ---
license: cc-by-nc-sa-4.0
tags:
- code
extra_gated_prompt: >-
You agree to not use the model to conduct experiments that cause harm to human
subjects or generate malicious code.
extra_gated_fields:
Company: text
Country: country
Specific date: date_picker
I want to use this model for:
type: select
options:
- Research
- Education
- label: Other
value: other
I agree to use this model for non-commercial use ONLY: checkbox
task_categories:
- text-generation
size_categories:
- 1M<n<10M
dataset_info:
- config_name: C
features:
- name: Source_Code
dtype: string
- name: IR_Original
dtype: string
splits:
- name: Perf_Optimized
num_bytes: 3383884149
num_examples: 341419
- name: Size_Optimized
num_bytes: 2528286566
num_examples: 341785
download_size: 1323447636
dataset_size: 5912170715
- config_name: C++
features:
- name: Source_Code
dtype: string
- name: IR_Original
dtype: string
splits:
- name: Perf_Optimized
num_bytes: 116351369851
num_examples: 2898509
- name: Size_Optimized
num_bytes: 92572469724
num_examples: 2916655
download_size: 51690627847
dataset_size: 208923839575
- config_name: D
features:
- name: Source_Code
dtype: string
- name: IR_Original
dtype: string
splits:
- name: Perf_Optimized
num_bytes: 2320830137
num_examples: 7000
- name: Size_Optimized
num_bytes: 3271276765
num_examples: 11054
download_size: 1316382832
dataset_size: 5592106902
- config_name: Fortran
features:
- name: Source_Code
dtype: string
- name: IR_Original
dtype: string
splits:
- name: Perf_Optimized
num_bytes: 357741835
num_examples: 6327
- name: Size_Optimized
num_bytes: 2320830137
num_examples: 7000
download_size: 563853972
dataset_size: 2678571972
- config_name: Go
features:
- name: Source_Code
dtype: string
- name: IR_Original
dtype: string
splits:
- name: Perf_Optimized
num_bytes: 819560767
num_examples: 3913
- name: Size_Optimized
num_bytes: 741733997
num_examples: 3925
download_size: 317182680
dataset_size: 1561294764
- config_name: Haskell
features:
- name: Source_Code
dtype: string
- name: IR_Original
dtype: string
splits:
- name: Perf_Optimized
num_bytes: 3838556743
num_examples: 27892
- name: Size_Optimized
num_bytes: 3667186152
num_examples: 28203
download_size: 1736729352
dataset_size: 7505742895
- config_name: Nim
features:
- name: Source_Code
dtype: string
- name: IR_Original
dtype: string
splits:
- name: Size_Optimized
num_bytes: 106424381
num_examples: 215
download_size: 22506456
dataset_size: 106424381
- config_name: Objective-C
features:
- name: Source_Code
dtype: string
- name: IR_Original
dtype: string
splits:
- name: Perf_Optimized
num_bytes: 1729045
num_examples: 283
- name: Size_Optimized
num_bytes: 1433377
num_examples: 283
download_size: 707508
dataset_size: 3162422
- config_name: Python
features:
- name: Source_Code
dtype: string
- name: IR_Original
dtype: string
splits:
- name: Perf_Optimized
num_bytes: 13118428652
num_examples: 154507
- name: Size_Optimized
num_bytes: 13118428652
num_examples: 154507
download_size: 6511950536
dataset_size: 26236857304
- config_name: Rust
features:
- name: Source_Code
dtype: string
- name: IR_Original
dtype: string
splits:
- name: Perf_Optimized
num_bytes: 5859467468
num_examples: 38323
- name: Size_Optimized
num_bytes: 8695405064
num_examples: 32720
download_size: 5326634011
dataset_size: 14554872532
- config_name: Swift
features:
- name: Source_Code
dtype: string
- name: IR_Original
dtype: string
splits:
- name: Perf_Optimized
num_bytes: 260013963
num_examples: 2003
- name: Size_Optimized
num_bytes: 266356839
num_examples: 2015
download_size: 144113584
dataset_size: 526370802
configs:
- config_name: C
data_files:
- split: Perf_Optimized
path: C/Perf_Optimized-*
- split: Size_Optimized
path: C/Size_Optimized-*
- config_name: C++
data_files:
- split: Perf_Optimized
path: C++/Perf_Optimized-*
- split: Size_Optimized
path: C++/Size_Optimized-*
- config_name: D
data_files:
- split: Perf_Optimized
path: D/Perf_Optimized-*
- split: Size_Optimized
path: D/Size_Optimized-*
- config_name: Fortran
data_files:
- split: Perf_Optimized
path: Fortran/Perf_Optimized-*
- split: Size_Optimized
path: Fortran/Size_Optimized-*
- config_name: Go
data_files:
- split: Perf_Optimized
path: Go/Perf_Optimized-*
- split: Size_Optimized
path: Go/Size_Optimized-*
- config_name: Haskell
data_files:
- split: Perf_Optimized
path: Haskell/Perf_Optimized-*
- split: Size_Optimized
path: Haskell/Size_Optimized-*
- config_name: Nim
data_files:
- split: Size_Optimized
path: Nim/Size_Optimized-*
- config_name: Objective-C
data_files:
- split: Perf_Optimized
path: Objective-C/Perf_Optimized-*
- split: Size_Optimized
path: Objective-C/Size_Optimized-*
- config_name: Python
data_files:
- split: Perf_Optimized
path: Python/Perf_Optimized-*
- split: Size_Optimized
path: Python/Size_Optimized-*
- config_name: Rust
data_files:
- split: Perf_Optimized
path: Rust/Perf_Optimized-*
- split: Size_Optimized
path: Rust/Size_Optimized-*
- config_name: Swift
data_files:
- split: Perf_Optimized
path: Swift/Perf_Optimized-*
- split: Size_Optimized
path: Swift/Size_Optimized-*
---
The dataset consists of source code and LLVM IR pairs generated from accepted and de-duplicated programming contest solutions. The dataset is divided into language configs and mode splits. The language can be one of `C`, `C++`, `D`, `Fortran`, `Go`, `Haskell`, `Nim`, `Objective-C`, `Python`, `Rust` and `Swift`, indicating the source files' languages. The mode split indicates the compilation mode, which can be either `Size_Optimized` or `Perf_Optimized`.
Once your access request has been approved, you can load the dataset as follows:
```python
from datasets import load_dataset
dataset = load_dataset("UKPLab/SLTrans", "C", split="Size_Optimized")
```
|
open-llm-leaderboard/details_dvruette__llama-13b-pretrained | ---
pretty_name: Evaluation run of dvruette/llama-13b-pretrained
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dvruette/llama-13b-pretrained](https://huggingface.co/dvruette/llama-13b-pretrained)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dvruette__llama-13b-pretrained\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T17:33:50.415201](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__llama-13b-pretrained/blob/main/results_2023-10-18T17-33-50.415201.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19431627516778524,\n\
\ \"em_stderr\": 0.004052066229872751,\n \"f1\": 0.25224412751677777,\n\
\ \"f1_stderr\": 0.004066214952392991,\n \"acc\": 0.46513107858970915,\n\
\ \"acc_stderr\": 0.01097629037543693\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.19431627516778524,\n \"em_stderr\": 0.004052066229872751,\n\
\ \"f1\": 0.25224412751677777,\n \"f1_stderr\": 0.004066214952392991\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1607278241091736,\n \
\ \"acc_stderr\": 0.010116708586037183\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836676\n\
\ }\n}\n```"
repo_url: https://huggingface.co/dvruette/llama-13b-pretrained
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T17_33_50.415201
path:
- '**/details_harness|drop|3_2023-10-18T17-33-50.415201.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T17-33-50.415201.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T17_33_50.415201
path:
- '**/details_harness|gsm8k|5_2023-10-18T17-33-50.415201.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T17-33-50.415201.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:55:00.882635.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:55:00.882635.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:55:00.882635.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T17_33_50.415201
path:
- '**/details_harness|winogrande|5_2023-10-18T17-33-50.415201.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T17-33-50.415201.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_55_00.882635
path:
- results_2023-07-19T18:55:00.882635.parquet
- split: 2023_10_18T17_33_50.415201
path:
- results_2023-10-18T17-33-50.415201.parquet
- split: latest
path:
- results_2023-10-18T17-33-50.415201.parquet
---
# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dvruette/llama-13b-pretrained
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dvruette/llama-13b-pretrained](https://huggingface.co/dvruette/llama-13b-pretrained) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dvruette__llama-13b-pretrained",
"harness_winogrande_5",
split="train")
```
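The per-run split names follow a simple convention: the run timestamp with `-` and `:` replaced by `_` (e.g. run `2023-10-18T17:33:50.415201` becomes split `2023_10_18T17_33_50.415201`). A minimal sketch of that mapping (the helper name is hypothetical, inferred from the split names listed in the configs above):

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp such as '2023-10-18T17:33:50.415201'
    into the corresponding split name '2023_10_18T17_33_50.415201'.

    The date part swaps '-' for '_' and the time part swaps ':' for '_';
    the 'T' separator and the fractional seconds are kept as-is.
    """
    date, _, time = ts.partition("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")


# Example: pick the split for a specific run instead of "latest".
split_name = timestamp_to_split("2023-10-18T17:33:50.415201")
# split_name == "2023_10_18T17_33_50.415201"
```

Passing such a split name (or `"latest"`) to `load_dataset` selects a specific evaluation run within a configuration.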
## Latest results
These are the [latest results from run 2023-10-18T17:33:50.415201](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__llama-13b-pretrained/blob/main/results_2023-10-18T17-33-50.415201.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.19431627516778524,
"em_stderr": 0.004052066229872751,
"f1": 0.25224412751677777,
"f1_stderr": 0.004066214952392991,
"acc": 0.46513107858970915,
"acc_stderr": 0.01097629037543693
},
"harness|drop|3": {
"em": 0.19431627516778524,
"em_stderr": 0.004052066229872751,
"f1": 0.25224412751677777,
"f1_stderr": 0.004066214952392991
},
"harness|gsm8k|5": {
"acc": 0.1607278241091736,
"acc_stderr": 0.010116708586037183
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836676
}
}
```
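As an informal sanity check (this mirrors the JSON above, not the official leaderboard tooling), the `"all"` block appears to be the unweighted mean of each metric over the tasks that report it:

```python
# Per-task results copied from the latest run shown above.
results = {
    "harness|drop|3": {"em": 0.19431627516778524, "f1": 0.25224412751677777},
    "harness|gsm8k|5": {"acc": 0.1607278241091736},
    "harness|winogrande|5": {"acc": 0.7695343330702447},
}

def aggregate(results, metric):
    """Unweighted mean of `metric` over the tasks that report it."""
    values = [task[metric] for task in results.values() if metric in task]
    return sum(values) / len(values)

acc_all = aggregate(results, "acc")
print(acc_all)  # ~0.4651, matching the "all" value above
```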
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
wtcherr/unsplash | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 1920147531.906
num_examples: 14942
download_size: 1935037165
dataset_size: 1920147531.906
---
# Dataset Card for "unsplash"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lombardata/panoptic_2023_06_29 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
- name: segments_info
list:
- name: area
dtype: int64
- name: bbox
sequence: float64
- name: category_id
dtype: int64
- name: id
dtype: int64
- name: iscrowd
dtype: int64
- name: image_name
dtype: string
splits:
- name: train
num_bytes: 674020155.2
num_examples: 1200
download_size: 659233073
dataset_size: 674020155.2
---
# Dataset Card for "panoptic_2023_06_29"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
michaelpenaariet/PIdemo | ---
language:
- en
size_categories:
- n<1K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[train]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
rajendrabaskota/hc3-wiki-perplexity-stride-32-maxlen-256 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: text
dtype: string
- name: source
dtype: string
- name: label
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: perplexity_score
dtype: float64
splits:
- name: test
num_bytes: 41314265
num_examples: 17387
download_size: 21811031
dataset_size: 41314265
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
kristmh/high_vs_random_min_len_1000 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validate
path: data/validate-*
dataset_info:
features:
- name: text_clean
dtype: string
- name: label
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: test
num_bytes: 19282841
num_examples: 7642
- name: train
num_bytes: 157361909
num_examples: 61136
- name: validate
num_bytes: 18779565
num_examples: 7642
download_size: 85467675
dataset_size: 195424315
---
# Dataset Card for "high_vs_random_min_len_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ColtonAi/Oi | ---
license: gpl
task_categories:
- question-answering
language:
- en
tags:
- not-for-all-audiences
- legal
- chemistry
- biology
- medical
pretty_name: kudurru
size_categories:
- n>1T
--- |
irds/clinicaltrials_2019_trec-pm-2019 | ---
pretty_name: '`clinicaltrials/2019/trec-pm-2019`'
viewer: false
source_datasets: ['irds/clinicaltrials_2019']
task_categories:
- text-retrieval
---
# Dataset Card for `clinicaltrials/2019/trec-pm-2019`
The `clinicaltrials/2019/trec-pm-2019` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/clinicaltrials#clinicaltrials/2019/trec-pm-2019).
# Data
This dataset provides:
- `queries` (i.e., topics); count=40
- `qrels`: (relevance assessments); count=12,996
- For `docs`, use [`irds/clinicaltrials_2019`](https://huggingface.co/datasets/irds/clinicaltrials_2019)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/clinicaltrials_2019_trec-pm-2019', 'queries')
for record in queries:
record # {'query_id': ..., 'disease': ..., 'gene': ..., 'demographic': ...}
qrels = load_dataset('irds/clinicaltrials_2019_trec-pm-2019', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Roberts2019TrecPm,
title={Overview of the TREC 2019 Precision Medicine Track},
author={Kirk Roberts and Dina Demner-Fushman and Ellen Voorhees and William R. Hersh and Steven Bedrick and Alexander J. Lazar and Shubham Pant and Funda Meric-Bernstam},
booktitle={TREC},
year={2019}
}
```
|
crumbly/tinycode-a | ---
dataset_info:
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 3418893547
num_examples: 1123379
download_size: 1191853783
dataset_size: 3418893547
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "tinycode-a"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/medical_qa_ru_data | ---
dataset_info:
features:
- name: date
dtype: string
- name: categ
dtype: string
- name: theme
dtype: string
- name: desc
dtype: string
- name: ans
dtype: string
- name: spec10
dtype: string
splits:
- name: train
num_bytes: 268150120
num_examples: 190335
download_size: 132020030
dataset_size: 268150120
---
# Dataset Card for "medical_qa_ru_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Har11k/QAdataset.1 | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
pretty_name: s
--- |
Nexdata/Number_Speech_Data_in_Mandarin_and_Dialects_by_Mobile_Phone | ---
---
# Dataset Card for Nexdata/Number_Speech_Data_in_Mandarin_and_Dialects_by_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/250?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
66 hours of digital Mandarin and dialect audio data captured by mobile phone. 592 people participated in the recording, with a balanced gender distribution; the languages include Sichuan dialect, Cantonese, and Mandarin, and the content covers daily-life scenes. Recordings were made on mainstream Android and iOS mobile phones. This dataset can be used for automatic speech recognition.
For more details, please refer to the link: https://www.nexdata.ai/datasets/250?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
Mandarin, Chinese Dialects
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commerical License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
polinaeterna/pleiades | ---
configs:
- config_name: locations
data_files: pleiades-locations-latest.csv
- config_name: names
data_files: pleiades-names-latest.csv
- config_name: places
data_files: pleiades-places-latest.csv
license: cc
--- |
ovelozz/dataset | ---
license: openrail
---
|
mikewang/padv2 | ---
pretty_name: 'Padv2 Dataset - Part1'
language:
- en
---
# Dataset Card for Padv2 Part1
## Dataset Description
**Official Repo:** https://github.com/lhc1224/OSAD_Net#-dataset-;
**IMPORTANT Notes**:
- This Huggingface dataset loads Part 1 of the Padv2 dataset (PADv2_part1.zip); the file can also be downloaded from: https://uofi.box.com/s/1atjh3d2p82qyxm3gp11514006va0llq
- Each instance in the loaded HF dataset contains the following fields:
  - `image_uid`: unique id of a dataset instance
  - `image_path`: path to the raw RGB image
  - `depth_path`: path to the depth annotation of the image
  - `mask_path`: path to the object mask of the image
  - `affordance_type`: affordance type of the object in the image
  - `original_divisions`: the original dataset provides three different divisions of the affordance types; this field stores the split ("train" or "test") of this instance under each division ("divide_1", "divide_2", "divide_3")
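For illustration only (all field values below are made up), picking the split of an instance under one of the three divisions could look like:

```python
# A hypothetical instance in the shape described above (paths shortened).
instance = {
    "image_uid": "0001",
    "image_path": "images/sit/0001.jpg",
    "depth_path": "depth/sit/0001.png",
    "mask_path": "masks/sit/0001.png",
    "affordance_type": "sit",
    "original_divisions": {"divide_1": "train", "divide_2": "test", "divide_3": "train"},
}

def split_for(instance, division):
    """Return "train" or "test" for this instance under the chosen division."""
    return instance["original_divisions"][division]

print(split_for(instance, "divide_2"))  # test
```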
**Paper Citation:**
```
@inproceedings{Oneluo,
title={One-Shot Affordance Detection},
author={Hongchen Luo and Wei Zhai and Jing Zhang and Yang Cao and Dacheng Tao},
booktitle={IJCAI},
year={2021}
}
```
```
@article{luo2021one,
title={One-Shot Object Affordance Detection in the Wild},
author={Zhai, Wei and Luo, Hongchen and Zhang, Jing and Cao, Yang and Tao, Dacheng},
journal={arXiv preprint arXiv:2108.03658},
year={2021}
}
```
## Dataset Summary
With complex scenes and rich annotations, the PADv2 dataset can be used as a test bed to benchmark affordance detection methods and may also facilitate downstream vision tasks, such as scene understanding, action recognition, and robot manipulation.
It contains 30k diverse images covering 39 affordance categories as well as 103 object categories from different scenes. |
yukihirop/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: labels_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: description
dtype: string
- name: creator
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: open_issues
dtype: int64
- name: closed_issues
dtype: int64
- name: state
dtype: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: due_on
dtype: 'null'
- name: closed_at
dtype: 'null'
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 13767402
num_examples: 1000
download_size: 3622731
dataset_size: 13767402
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
annotations_creators: []
language: []
language_creators: []
license: []
multilinguality: []
pretty_name: HuggingFace GitHub Issues
size_categories: []
source_datasets: []
tags: []
task_categories:
- text-classification
- text-retrieval
task_ids:
- multi-class-classification
- multi-label-classification
- document-retrieval
---
|
napatswift/pmt000 | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 58702671.0
num_examples: 109
download_size: 58746798
dataset_size: 58702671.0
---
# Dataset Card for "pmt000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
0x22almostEvil/ru-riddles-377 | ---
license: apache-2.0
task_categories:
- question-answering
language:
- ru
tags:
- QnA
- Riddles
size_categories:
- n<1K
---
# Dataset Card for Russian Riddles with Answers (377 entries)
### Dataset Summary
Contains a Parquet file of QnA riddle and answer pairs.
Each row consists of:
* INSTRUCTION
* RESPONSE
* SOURCE
* METADATA (json with language).
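For example (the row below is illustrative, not taken from the file), the `METADATA` column holds a JSON string that needs parsing before use:

```python
import json

# A hypothetical row in the shape described above.
row = {
    "INSTRUCTION": "Зимой и летом одним цветом. Что это?",
    "RESPONSE": "Ёлка.",
    "SOURCE": "https://azbyka.ru/deti/logicheskie-i-zanimatelnye-zadachi",
    "METADATA": '{"language": "ru"}',
}

# METADATA is stored as a JSON string, so parse it to access the language tag.
metadata = json.loads(row["METADATA"])
print(metadata["language"])  # ru
```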
### Licensing Information
Data is scraped from several sites. Since most of the riddles and answers are publicly available and popular, any ToS or licensing of the sites themselves is irrelevant; I reserve the right to apply a public and permissive license.
Moreover, there was no licensing information on these sites, which makes sense given the public availability and prominence of the content they provide.
### Acknowledgements
Thanks Freddie#5762 for providing this data!
He mentioned these URLs:
- https://azbyka.ru/deti/logicheskie-i-zanimatelnye-zadachi
- https://bbf.ru/riddles/ |
MadhuLokanath/New_Data | ---
license: apache-2.0
---
|
benayas/banking_augmented_5pct_v0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1040273
num_examples: 10003
download_size: 407790
dataset_size: 1040273
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pvisnrt/french-snli | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: translated_premise
dtype: string
- name: translated_hypothesis
dtype: string
splits:
- name: test
num_bytes: 2291311
num_examples: 10000
- name: train
num_bytes: 122397311
num_examples: 550152
- name: validation
num_bytes: 2301319
num_examples: 10000
download_size: 40410905
dataset_size: 126989941
---
# Dataset Card for "french-snli"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_163 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 21288462912.5
num_examples: 221644
download_size: 19616462343
dataset_size: 21288462912.5
---
# Dataset Card for "chunk_163"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Superdetec/Mscdude | ---
license: openrail
---
|
next-social/reddit_crush | ---
dataset_info:
features:
- name: selftext
dtype: string
- name: subreddit
dtype: string
splits:
- name: train
num_bytes: 91006275
num_examples: 114942
download_size: 0
dataset_size: 91006275
---
# Dataset Card for "reddit_crush"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-conll2003-conll2003-bc26c9-1485554292 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- conll2003
eval_info:
task: entity_extraction
model: baptiste/deberta-finetuned-ner
metrics: []
dataset_name: conll2003
dataset_config: conll2003
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: baptiste/deberta-finetuned-ner
* Dataset: conll2003
* Config: conll2003
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
maxolotl/must-c-en-fr-wait05_22.22 | ---
dataset_info:
features:
- name: current_source
dtype: string
- name: current_target
dtype: string
- name: target_token
dtype: string
splits:
- name: train
num_bytes: 1117394934
num_examples: 5530635
- name: test
num_bytes: 12413160
num_examples: 64317
- name: validation
num_bytes: 5823766
num_examples: 29172
download_size: 186632709
dataset_size: 1135631860
---
# Dataset Card for "must-c-en-fr-wait05_22.22"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gayanin/kaggle-native-v8-vocab-noised | ---
dataset_info:
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 556420
num_examples: 5140
- name: test
num_bytes: 70643
num_examples: 643
- name: validation
num_bytes: 69615
num_examples: 643
download_size: 308248
dataset_size: 696678
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/f247faaa | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 180
num_examples: 10
download_size: 1337
dataset_size: 180
---
# Dataset Card for "f247faaa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vinnyh589/Personagens | ---
license: unknown
---
|
open-llm-leaderboard/details_AGI-0__ThetaWave-7B-v0.1 | ---
pretty_name: Evaluation run of AGI-0/ThetaWave-7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AGI-0/ThetaWave-7B-v0.1](https://huggingface.co/AGI-0/ThetaWave-7B-v0.1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AGI-0__ThetaWave-7B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T20:04:17.060853](https://huggingface.co/datasets/open-llm-leaderboard/details_AGI-0__ThetaWave-7B-v0.1/blob/main/results_2024-02-29T20-04-17.060853.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.633104851512329,\n\
\ \"acc_stderr\": 0.032702067296011314,\n \"acc_norm\": 0.6350523719072316,\n\
\ \"acc_norm_stderr\": 0.03336817115216934,\n \"mc1\": 0.4749082007343941,\n\
\ \"mc1_stderr\": 0.017481446804104007,\n \"mc2\": 0.6326747917681392,\n\
\ \"mc2_stderr\": 0.015330849945700742\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6203071672354948,\n \"acc_stderr\": 0.014182119866974872,\n\
\ \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.013847460518892976\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6759609639514041,\n\
\ \"acc_stderr\": 0.004670581884781163,\n \"acc_norm\": 0.8571997610037841,\n\
\ \"acc_norm_stderr\": 0.0034915398589272896\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n\
\ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n\
\ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.02469721693087894,\n \
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.02469721693087894\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976044,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976044\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069436,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069436\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.013890862162876166,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.013890862162876166\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4659217877094972,\n\
\ \"acc_stderr\": 0.016683615837486863,\n \"acc_norm\": 0.4659217877094972,\n\
\ \"acc_norm_stderr\": 0.016683615837486863\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.438722294654498,\n\
\ \"acc_stderr\": 0.012673969883493272,\n \"acc_norm\": 0.438722294654498,\n\
\ \"acc_norm_stderr\": 0.012673969883493272\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6388888888888888,\n \"acc_stderr\": 0.01943177567703731,\n \
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.01943177567703731\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.043502714429232425,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.043502714429232425\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274645,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274645\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357304,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357304\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4749082007343941,\n\
\ \"mc1_stderr\": 0.017481446804104007,\n \"mc2\": 0.6326747917681392,\n\
\ \"mc2_stderr\": 0.015330849945700742\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.010905978112156878\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5564821834723275,\n \
\ \"acc_stderr\": 0.013684327592606165\n }\n}\n```"
repo_url: https://huggingface.co/AGI-0/ThetaWave-7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|arc:challenge|25_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|gsm8k|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hellaswag|10_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T20-04-17.060853.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T20-04-17.060853.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- '**/details_harness|winogrande|5_2024-02-29T20-04-17.060853.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T20-04-17.060853.parquet'
- config_name: results
data_files:
- split: 2024_02_29T20_04_17.060853
path:
- results_2024-02-29T20-04-17.060853.parquet
- split: latest
path:
- results_2024-02-29T20-04-17.060853.parquet
---
# Dataset Card for Evaluation run of AGI-0/ThetaWave-7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AGI-0/ThetaWave-7B-v0.1](https://huggingface.co/AGI-0/ThetaWave-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AGI-0__ThetaWave-7B-v0.1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-29T20:04:17.060853](https://huggingface.co/datasets/open-llm-leaderboard/details_AGI-0__ThetaWave-7B-v0.1/blob/main/results_2024-02-29T20-04-17.060853.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.633104851512329,
"acc_stderr": 0.032702067296011314,
"acc_norm": 0.6350523719072316,
"acc_norm_stderr": 0.03336817115216934,
"mc1": 0.4749082007343941,
"mc1_stderr": 0.017481446804104007,
"mc2": 0.6326747917681392,
"mc2_stderr": 0.015330849945700742
},
"harness|arc:challenge|25": {
"acc": 0.6203071672354948,
"acc_stderr": 0.014182119866974872,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.013847460518892976
},
"harness|hellaswag|10": {
"acc": 0.6759609639514041,
"acc_stderr": 0.004670581884781163,
"acc_norm": 0.8571997610037841,
"acc_norm_stderr": 0.0034915398589272896
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.02469721693087894,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.02469721693087894
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976044,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976044
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069436,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876166,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876166
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4659217877094972,
"acc_stderr": 0.016683615837486863,
"acc_norm": 0.4659217877094972,
"acc_norm_stderr": 0.016683615837486863
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046626,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046626
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765137,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.438722294654498,
"acc_stderr": 0.012673969883493272,
"acc_norm": 0.438722294654498,
"acc_norm_stderr": 0.012673969883493272
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.01943177567703731,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.01943177567703731
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.043502714429232425,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.043502714429232425
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274645,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357304,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357304
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4749082007343941,
"mc1_stderr": 0.017481446804104007,
"mc2": 0.6326747917681392,
"mc2_stderr": 0.015330849945700742
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.010905978112156878
},
"harness|gsm8k|5": {
"acc": 0.5564821834723275,
"acc_stderr": 0.013684327592606165
}
}
```
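Since the results block above is plain JSON keyed by task name, ranking tasks by accuracy is straightforward once it is parsed. The sketch below uses a trimmed-down, hypothetical subset of the results rather than the full file:

```python
import json

# A trimmed-down stand-in for the full results JSON shown above.
results = json.loads("""
{
  "harness|hendrycksTest-marketing|5": {"acc": 0.8846153846153846},
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
  "harness|winogrande|5": {"acc": 0.8153117600631413}
}
""")

# Sort tasks from strongest to weakest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc"], reverse=True)
for task, metrics in ranked:
    print(f"{metrics['acc']:.3f}  {task}")
```

The same pattern works on the full `results_*.json` file linked above, since every per-task entry carries an `acc` key (the `truthfulqa:mc` entry uses `mc1`/`mc2` instead, so it would need a special case).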
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
web2write/kicowrite | ---
license: cc-by-4.0
---
|
qiezhian/open-test | ---
license: apache-2.0
---
|
AlekseyKorshuk/chatml-evaluation | ---
dataset_info:
features:
- name: prompt
list:
- name: from
dtype: string
- name: role_type
dtype: string
- name: value
dtype: string
- name: response
struct:
- name: from
dtype: string
- name: role_type
dtype: string
- name: value
dtype: string
- name: source
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1442834
num_examples: 319
download_size: 0
dataset_size: 1442834
---
# Dataset Card for "chatml-evaluation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qa_zre | ---
annotations_creators:
- no-annotation
language_creators:
- expert-generated
language:
- en
license:
- unknown
multilinguality:
- monolingual
pretty_name: QaZre
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- question-answering
task_ids: []
paperswithcode_id: null
tags:
- zero-shot-relation-extraction
dataset_info:
features:
- name: relation
dtype: string
- name: question
dtype: string
- name: subject
dtype: string
- name: context
dtype: string
- name: answers
sequence: string
splits:
- name: test
num_bytes: 29410194
num_examples: 120000
- name: validation
num_bytes: 1481430
num_examples: 6000
- name: train
num_bytes: 2054954011
num_examples: 8400000
download_size: 516061636
dataset_size: 2085845635
---
# Dataset Card for QaZre
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [http://nlp.cs.washington.edu/zeroshot](http://nlp.cs.washington.edu/zeroshot)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 516.06 MB
- **Size of the generated dataset:** 2.09 GB
- **Total amount of disk used:** 2.60 GB
### Dataset Summary
A dataset reducing relation extraction to simple reading comprehension questions.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 516.06 MB
- **Size of the generated dataset:** 2.09 GB
- **Total amount of disk used:** 2.60 GB
An example of 'validation' looks as follows.
```
{
"answers": [],
"context": "answer",
"question": "What is XXX in this question?",
"relation": "relation_name",
"subject": "Some entity Here is a bit of context which will explain the question in some way"
}
```
### Data Fields
The data fields are the same among all splits.
#### default
- `relation`: a `string` feature.
- `question`: a `string` feature.
- `subject`: a `string` feature.
- `context`: a `string` feature.
- `answers`: a `list` of `string` features.
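As the example instance suggests, each `question` contains the placeholder `XXX` standing in for the `subject`; a common preprocessing step is to splice the subject back in before feeding the pair to a reading-comprehension model. A minimal sketch (the record below is hypothetical, not taken from the dataset):

```python
def fill_subject(question: str, subject: str) -> str:
    """Replace the XXX placeholder in a qa_zre question with the subject."""
    return question.replace("XXX", subject)

# Hypothetical record following the qa_zre field schema.
record = {
    "relation": "place_of_birth",
    "question": "Where was XXX born?",
    "subject": "Ada Lovelace",
    "context": "Ada Lovelace was born in London.",
    "answers": ["London"],
}

query = fill_subject(record["question"], record["subject"])
print(query)  # -> Where was Ada Lovelace born?
```

Records whose `answers` list is empty (as in the validation example above) are the unanswerable pairs, where the relation is not expressed in the context.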
### Data Splits
| name | train | validation | test |
|---------|--------:|-----------:|-------:|
| default | 8400000 | 6000 | 120000 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
Unknown.
### Citation Information
```
@inproceedings{levy-etal-2017-zero,
title = "Zero-Shot Relation Extraction via Reading Comprehension",
author = "Levy, Omer and
Seo, Minjoon and
Choi, Eunsol and
Zettlemoyer, Luke",
booktitle = "Proceedings of the 21st Conference on Computational Natural Language Learning ({C}o{NLL} 2017)",
month = aug,
year = "2017",
address = "Vancouver, Canada",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/K17-1034",
doi = "10.18653/v1/K17-1034",
pages = "333--342",
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@lhoestq](https://github.com/lhoestq), [@ghomasHudson](https://github.com/ghomasHudson), [@lewtun](https://github.com/lewtun) for adding this dataset. |
hieunguyen1053/hdpl_sft | ---
dataset_info:
features:
- name: url
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 15414457
num_examples: 3903
download_size: 5580453
dataset_size: 15414457
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jpwahle/etpc | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- sentiment-classification
pretty_name: Extended Paraphrase Typology Corpus
---
# Dataset Card for ETPC
## Table of Contents
- [Dataset Card for ETPC](#dataset-card-for-etpc)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/venelink/ETPC/
- **Repository:**
- **Paper:** [ETPC - A Paraphrase Identification Corpus Annotated with Extended Paraphrase Typology and Negation](http://www.lrec-conf.org/proceedings/lrec2018/pdf/661.pdf)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
We present the Extended Paraphrase Typology (EPT) and the Extended Typology Paraphrase Corpus (ETPC). The EPT typology addresses several practical limitations of existing paraphrase typologies: it is the first typology that copes with the non-paraphrase pairs in the paraphrase identification corpora and distinguishes between contextual and habitual paraphrase types. ETPC is the largest corpus to date annotated with atomic paraphrase types. It is the first corpus with detailed annotation of both the paraphrase and the non-paraphrase pairs and the first corpus annotated with paraphrase and negation. Both new resources contribute to better understanding the paraphrase phenomenon, and allow for studying the relationship between paraphrasing and negation. To the developers of Paraphrase Identification systems ETPC corpus offers better means for evaluation and error analysis. Furthermore, the EPT typology and ETPC corpus emphasize the relationship with other areas of NLP such as Semantic Similarity, Textual Entailment, Summarization and Simplification.
### Supported Tasks and Leaderboards
- `text-classification`
### Languages
The text in the dataset is in English (`en`).
## Dataset Structure
### Data Fields
- `idx`: Monotonically increasing index ID.
- `sentence1`: First sentence of the pair.
- `sentence2`: Second sentence of the pair.
- `etpc_label`: Whether the text pair is a paraphrase, either "yes" (1) or "no" (0), according to the ETPC annotation schema.
- `mrpc_label`: Whether the text pair is a paraphrase, either "yes" (1) or "no" (0), according to the MRPC annotation schema.
- `negation`: Whether one sentence is a negation of the other, either "yes" (1) or "no" (0).
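Because `etpc_label` and `mrpc_label` come from two different annotation schemas, they need not agree on every pair. A minimal sketch (using toy rows that mimic the field layout above, not real corpus data) of how one might surface the disagreements:

```python
def find_schema_disagreements(rows):
    """Return the idx of every pair whose ETPC and MRPC labels differ."""
    return [r["idx"] for r in rows if r["etpc_label"] != r["mrpc_label"]]

# Toy rows mimicking the field layout described above (not real corpus data).
rows = [
    {"idx": 0, "sentence1": "...", "sentence2": "...",
     "etpc_label": 1, "mrpc_label": 1, "negation": 0},
    {"idx": 1, "sentence1": "...", "sentence2": "...",
     "etpc_label": 0, "mrpc_label": 1, "negation": 0},
]
print(find_schema_disagreements(rows))  # -> [1]
```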
### Data Splits
- `train`: 5801 sentence pairs
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
Authors of the online news sources from which the original MRPC sentence pairs were drawn.
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Unknown.
### Citation Information
```bibtex
@inproceedings{kovatchev-etal-2018-etpc,
title = "{ETPC} - A Paraphrase Identification Corpus Annotated with Extended Paraphrase Typology and Negation",
author = "Kovatchev, Venelin and
Mart{\'\i}, M. Ant{\`o}nia and
Salam{\'o}, Maria",
booktitle = "Proceedings of the Eleventh International Conference on Language Resources and Evaluation ({LREC} 2018)",
month = may,
year = "2018",
address = "Miyazaki, Japan",
publisher = "European Language Resources Association (ELRA)",
url = "https://aclanthology.org/L18-1221",
}
```
### Contributions
Thanks to [@jpwahle](https://github.com/jpwahle) for adding this dataset. |
paezand/malloc | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 753000000
num_examples: 1000000
download_size: 135163516
dataset_size: 753000000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ejazhabibdar/EjazHabibDar | ---
license: apache-2.0
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 3527550.0
num_examples: 30
download_size: 3486974
dataset_size: 3527550.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mychen76/wiki_medical_terms_llama2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 42966707.83151144
num_examples: 5488
- name: test
num_bytes: 10749506.168488558
num_examples: 1373
- name: validation
num_bytes: 2153032.917941991
num_examples: 275
download_size: 29713610
dataset_size: 55869246.91794199
---
# Dataset Card for "wiki_medical_terms_llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cosc/cutesexyrobutts | ---
license: creativeml-openrail-m
---
407 images and captions taken from Danbooru, hand-picked and cropped to 768x768 pixels. |
KeshavRa/Qualify_Apply_For_Village_Database | ---
dataset_info:
features:
- name: questions
dtype: string
- name: answers
dtype: string
splits:
- name: train
num_bytes: 9855
num_examples: 36
download_size: 7898
dataset_size: 9855
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
VuongQuoc/test_chemistry | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1003753.0
num_examples: 592
download_size: 1016896
dataset_size: 1003753.0
---
# Dataset Card for "test_chemistry"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
benayas/banking_artificial_20pct_v0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1049467
num_examples: 10003
download_size: 325096
dataset_size: 1049467
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
heliosprime/twitter_dataset_1713089836 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2527
num_examples: 9
download_size: 7022
dataset_size: 2527
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713089836"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
daniilak/vk_groups | ---
license: cc0-1.0
task_categories:
- text-generation
language:
- ru
pretty_name: VK.Groups
size_categories:
- 100M<n<1B
---
### Dataset
The dataset contains a list of all public pages (communities or groups) of the social network VKontakte (VK.COM).
It currently covers 222,130,000 communities.
The dataset has 25 fields. The CSV files are tab-delimited ("\t").
A separate list of 41,614 verified groups is also included.
### Fields
Full versions contain the following fields:
["id", "screen_name", "members_count", "name", "type", "verified", "description", "activity", "can_see_all_posts", "city_id", "city_title", "contacts", "country_id", "country_title", "deactivated", "deactivated_message", "deactivated_type", "finish_date", "is_closed", "photo_100", "photo_200", "photo_50", "site", "start_date", "status"]
Minified versions contain the following fields:
[ "id", "members_count", "name", "type", "verified", "activity", "city_id", "country_id", "deactivated", "finish_date", "is_closed", "site"]
Description:
* id - integer Community ID
* screen_name - string Short address of the community, for example, apiclub
* members_count - integer Number of community members
* name - string Community name
* type - string Community type: group — group; page — public page; event — event
* verified - integer Information about whether the community has been verified. Possible values: 1 - is; 0 - is not
* description - string Community description text
* activity - string Theme of the public page. For groups, a string value (the theme, or whether the group is open); for events, the start date
* can_see_all_posts - integer Information about whether it is allowed to see other people's posts on the community wall. Possible values: 1 - can; 0 - cannot
* city_id - integer id of the city specified in the community information
* city_title - string Name of the city specified in the community information
* contacts - json-array Information from the contact block of the public page. An array of objects, each of which can contain fields: user_id (integer) — user ID; desc (string) - position; phone (string) — phone number; email (string) — email address
* country_id - integer ID of the country specified in the community information
* country_title - string name of the country specified in the community information
* deactivated - string Returned if the community has been deleted or disabled. Possible values: deleted — the community has been deleted; banned - the community is blocked;
* deactivated_message - string Reason for blocking the community
* deactivated_type - string Returned if the community is deleted or banned, contains deleted or banned
* finish_date - For event communities, the end time of the event in unixtime format
* is_closed - integer Whether the community is closed. Possible values: 0 — open; 1 - closed; 2 - private
* photo_100 - string URL of the main photo with a size of 100x100px
* photo_200 - string URL of the main photo in the maximum size
* photo_50 - string URL of the main photo with size 50x50px
* site - string Site address specified in the profile.
* start_date - For event communities, the start time of the event in unixtime format. For public pages, the date of foundation in YYYYMMDD format
* status - string Community status
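A minimal sketch of parsing the tab-delimited files into dictionaries using the 25-field layout above, decoding the JSON `contacts` column along the way (this assumes the files have no header row and uses no quoting, which matches typical raw TSV dumps; adjust if the actual files differ):

```python
import csv
import io
import json

# The 25 columns of the full files, in the order listed above.
FIELDS = ["id", "screen_name", "members_count", "name", "type", "verified",
          "description", "activity", "can_see_all_posts", "city_id",
          "city_title", "contacts", "country_id", "country_title",
          "deactivated", "deactivated_message", "deactivated_type",
          "finish_date", "is_closed", "photo_100", "photo_200", "photo_50",
          "site", "start_date", "status"]

def parse_rows(tsv_text):
    """Yield one dict per community, decoding the JSON 'contacts' column."""
    reader = csv.DictReader(io.StringIO(tsv_text), fieldnames=FIELDS,
                            delimiter="\t", quoting=csv.QUOTE_NONE)
    for row in reader:
        if row["contacts"]:
            row["contacts"] = json.loads(row["contacts"])
        yield row
```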
### Dataset Creation
The data was scraped through the [groups.getById VK API method](https://dev.vk.com/ru/method/groups.getById).
### License
This dataset is publicly available: you may use it in scientific research, design work, and other projects. The only condition is that you publish a link to this dataset.
|
mask-distilled-one-sec-cv12/chunk_160 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1179566892
num_examples: 231651
download_size: 1203138651
dataset_size: 1179566892
---
# Dataset Card for "chunk_160"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
asas-ai/ArTrivia | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: title
dtype: string
- name: paragraphs
list:
- name: context
dtype: string
- name: qas
list:
- name: answers
list:
- name: answer_start
dtype: int64
- name: text
dtype: string
- name: id
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 9477598
num_examples: 8345
- name: validation
num_bytes: 1999664
num_examples: 1700
download_size: 5199179
dataset_size: 11477262
task_categories:
- question-answering
language:
- ar
pretty_name: ArTrivia
---
# Dataset Card for "ArTrivia"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
keremberke/csgo-object-detection | ---
task_categories:
- object-detection
tags:
- roboflow
- roboflow2huggingface
---
<div align="center">
<img width="640" alt="keremberke/csgo-object-detection" src="https://huggingface.co/datasets/keremberke/csgo-object-detection/resolve/main/thumbnail.jpg">
</div>
### Dataset Labels
```
['ct', 'cthead', 't', 'thead']
```
### Number of Images
```json
{'train': 3879, 'valid': 383, 'test': 192}
```
### How to Use
- Install [datasets](https://pypi.org/project/datasets/):
```bash
pip install datasets
```
- Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("keremberke/csgo-object-detection", name="full")
example = ds['train'][0]
```
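The annotations follow the COCO convention, where a box is `[x_min, y_min, width, height]`. An illustrative helper (a sketch, not part of the dataset tooling) for converting such a box to corner coordinates, e.g. for drawing:

```python
def coco_bbox_to_corners(bbox):
    """Convert a COCO-style [x_min, y_min, width, height] box to (x1, y1, x2, y2)."""
    x, y, w, h = bbox
    return (x, y, x + w, y + h)

print(coco_bbox_to_corners([100, 50, 30, 60]))  # -> (100, 50, 130, 110)
```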
### Roboflow Dataset Page
[https://universe.roboflow.com/asd-culfr/wlots/dataset/1](https://universe.roboflow.com/asd-culfr/wlots/dataset/1?ref=roboflow2huggingface)
### Citation
```
@misc{ wlots_dataset,
title = { wlots Dataset },
type = { Open Source Dataset },
author = { asd },
howpublished = { \\url{ https://universe.roboflow.com/asd-culfr/wlots } },
url = { https://universe.roboflow.com/asd-culfr/wlots },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { may },
note = { visited on 2023-01-27 },
}
```
### License
CC BY 4.0
### Dataset Summary
This dataset was exported via roboflow.com on December 28, 2022 at 8:08 PM GMT
Roboflow is an end-to-end computer vision platform that helps you
* collaborate with your team on computer vision projects
* collect & organize images
* understand unstructured image data
* annotate, and create datasets
* export, train, and deploy computer vision models
* use active learning to improve your dataset over time
It includes 4454 images.
Ct-cthead-t-thead are annotated in COCO format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
* Resize to 416x416 (Fill (with center crop))
The following augmentation was applied to create 3 versions of each source image:
* Random brightness adjustment of between -15 and +15 percent
|
Aehus/bumblebee_1 | ---
dataset_info:
features:
- name: new_output
dtype: string
- name: new_input
dtype: string
- name: new_instruction
dtype: string
splits:
- name: train
num_bytes: 5299101
num_examples: 5457
download_size: 2701971
dataset_size: 5299101
---
# Dataset Card for "bumblebee_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jgwill/gia-young-picasso-v01-201208 | ---
license: gpl-3.0
---
|