| datasetId | card |
|---|---|
jilp00/youtoks-curious-amalgam-v2-inst | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5938879
num_examples: 9358
download_size: 2549900
dataset_size: 5938879
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
atgarcia/EMGSoundTrain3 | ---
dataset_info:
features:
- name: text
dtype: string
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: emg
sequence:
sequence: float64
- name: emg_sound
sequence: float64
splits:
- name: train
num_bytes: 2237388910
num_examples: 1000
download_size: 752700675
dataset_size: 2237388910
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pszymanski/allegro-polish-summaries-corpus-llama2-2000rows | ---
license: apache-2.0
---
|
CATIE-AQ/orange_sum_fr_prompt_text_generation_from_an_article | ---
language:
- fr
license: cc-by-sa-4.0
size_categories:
- 100K<n<1M
task_categories:
- text-generation
tags:
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- orange_sum
---
# orange_sum_fr_prompt_text_generation_from_an_article
## Summary
**orange_sum_fr_prompt_text_generation_from_an_article** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **539,400** rows that can be used for a text generation task.
The original data (without prompts) comes from the dataset [orange_sum](https://huggingface.co/datasets/orange_sum) by Eddine et al.
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
24 prompts were created for this dataset. The logic applied consists of proposing each instruction in three forms: the infinitive (e.g. « Continuer »), the informal *tutoiement* (« Continue ») and the formal *vouvoiement* (« Continuez »).
```
'"'+document+'"\n Continuer le texte sur 1000 caractères maximum :',
'"'+document+'"\n Continue le texte sur 1000 caractères maximum :',
'"'+document+'"\n Continuez le texte sur 1000 caractères maximum :',
'"'+document+'"\n Poursuivre le texte sur 1000 caractères maximum :',
'"'+document+'"\n Poursuis le texte sur 1000 caractères maximum :',
'"'+document+'"\n Poursuivez le texte sur 1000 caractères maximum :',
'"'+document+'"\n Prolonger le texte sur 1000 caractères maximum :',
'"'+document+'"\n Prolonge le texte sur 1000 caractères maximum :',
'"'+document+'"\n Prolongez le texte sur 1000 caractères maximum :',
'"'+document+'"\n Rédiger la suite du texte : ',
'"'+document+'"\n Rédige la suite du texte : ',
'"'+document+'"\n Rédigez la suite du texte : ',
'"'+document+'"\n Imaginer la suite du texte : ',
'"'+document+'"\n Imagine la suite du texte : ',
'"'+document+'"\n Imaginez la suite du texte : ',
'"'+document+'"\n Ecrire la suite du texte : ',
'"'+document+'"\n Ecris la suite du texte : ',
'"'+document+'"\n Ecriver la suite du texte : ',
'"'+document+'"\n Développer la suite du texte : ',
'"'+document+'"\n Développe la suite du texte : ',
'"'+document+'"\n Développez la suite du texte : ',
'"'+document+'"\nGénérer la suite du texte : ',
'"'+document+'"\nGénère la suite du texte : ',
'"'+document+'"\n Générez la suite du texte : ',
```
### Features used in the prompts
In the prompt list above, `document` and `targets` have been constructed from:
```
from datasets import load_dataset

orange_sum = load_dataset('orange_sum', 'abstract')
for i in range(len(orange_sum['train'])):
    if len(orange_sum['train'][i]['text']) > 1000:
        # document: first 1000 characters of the article; targets: its continuation
        document = orange_sum['train'][i]['text'][:1000]
        targets = orange_sum['train'][i]['text'][1000:]
```
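Putting the pieces together, a hedged sketch of the expansion step (the `build_rows` helper is hypothetical, not the actual DFP build script): every document is crossed with every prompt template to produce the input/target columns.

```python
def build_rows(documents, targets, prompts):
    # Cross every (document, target) pair with every prompt template,
    # mirroring how each article yields one row per prompt.
    rows = []
    for document, target in zip(documents, targets):
        for template in prompts:
            rows.append({"inputs": template(document), "targets": target})
    return rows

# Two of the 24 templates from the list above, written as callables:
prompts = [
    lambda d: '"' + d + '"\n Continuer le texte sur 1000 caractères maximum :',
    lambda d: '"' + d + '"\n Continue le texte sur 1000 caractères maximum :',
]

rows = build_rows(["Un article de presse."], ["La suite de l'article."], prompts)
```

With the full 24 templates, the row counts are multiples of 24, consistent with the split sizes listed below.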
## Splits
- `train` with 472,944 samples
- `valid` with 33,096 samples
- `test` with 33,360 samples
## How to use?
```
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/orange_sum_fr_prompt_text_generation_from_an_article")
```
## Citation
### Original data
> @article{eddine2020barthez,
title={BARThez: a Skilled Pretrained French Sequence-to-Sequence Model},
author={Eddine, Moussa Kamal and Tixier, Antoine J-P and Vazirgiannis, Michalis},
journal={arXiv preprint arXiv:2010.12321},
year={2020}
}
### This Dataset
> @misc {centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
author = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
title = { DFP (Revision 1d24c09) },
year = 2023,
url = { https://huggingface.co/datasets/CATIE-AQ/DFP },
doi = { 10.57967/hf/1200 },
publisher = { Hugging Face }
}
## License
CC-BY-SA-4.0 |
monology/bagel-v0.3 | ---
license: cc-by-4.0
---
Just a backup of [jondurbin/bagel-v0.3](https://huggingface.co/datasets/jondurbin/bagel-v0.3) in .jsonl.zst format. |
Mxode/University-News-Instruction-Zh | ---
license: apache-2.0
task_categories:
- zero-shot-classification
- summarization
- text-generation
tags:
- news
- campus
size_categories:
- 100K<n<1M
language:
- zh
---
A collection of university campus news, about 65k × 3 (task types) entries. **Lightly de-identified: author names and similar details have been masked wherever possible.** The data has been organized into instruction form, formatted as follows:
```json
{
"id": <id>,
"category": "(title_summarize|news_classify|news_generate)",
    "instruction": <the concrete instruction for the task>,
    "input": <empty>,
    "output": <the output corresponding to the instruction>
}
```
There are three task types in total: **title summarization**, **column classification**, and **news generation**. They are essentially generated by combining the title, column, and body from the news metadata, so the data is guaranteed to be fully accurate. Each field has been normalized to a **single-line** format. Samples of the three tasks follow:
```json
// title summarization
{
"id": 22106,
"category": "title_summarize",
"instruction": "请你给下面的新闻取一则标题:\n点击图片观看视频 科技创新潜质人才是海院自主选拔的一种新尝试,根据规定,凡在高中阶段获得省级科技创新大赛一等奖或以上的应届高中毕业生,都可申请报名。今年苏浙沪三地共有300多名学生,申请以科技创新潜质人才身份入学海城海院。经过专家甄选,今天上午,有82名学生参加了考试。 上午的面试不考语数外,考生要靠自己的科技成果来证明自己是具有科技创新潜质的人才。记者发现,学生们带来的创新成果五花八门,涉及生物、微电子等多种前沿学科。经过上午的筛选,下午只剩下一半的学生进入了特色测试部分,这部分仍不考语数外,重点考察学生的发散性思维,及兴趣爱好和潜质,海院将根据测试结果,最终确定获得科技创新潜质人才的自主选拔认定名单,进入名单的考生,高考成绩只要达到当地的一本线,就能被海院录取。",
"input": "",
"output": "该新闻的标题可以是:《不考语数外 海院自主选拔创新潜质人才》"
}
```
```json
// column classification
{
"id": 134781,
"category": "news_classify",
"instruction": "根据下面的新闻,归纳出它的分类:\n## 英国格拉斯哥大学医学院副院长访问海城海院生命学院\n\n11月20日,英国格拉斯哥大学(University of Glasgow,以下简称格大)医学院副院长Chris Loughrey一行访问海城海院生命科学技术学院,生命学院副院长张晓君出席会议,魏芳、王晓雷、徐沁等多位教师与会参加交流。格大和海城海院同为国际大学交流组织Universitas 21的成员,在生命和医学领域合作紧密。双方互相介绍了两校在生命科学领域的学科专业、课程体系、学生培养等发展情况。在相互交流中,双方希望在这个宏观的合作框架下,逐步增进相互了解,在人才培养、科学研究等方面形成更多更实质性的合作。格拉斯哥大学是英联邦地区建校历史最悠久的四所大学之一。研究经费及教育质量均排名英国前十,出过7位诺奖获得者及3位英国首相。格大下设四大学院,其中医学院有教职员工2000余人,学生5000余人,医学院下设七个研究中心,涉及分子/细胞/系统生物学、植物、微生物、免疫、癌症、生物多样性、神经科学、心血管医学等领域。",
"input": "",
"output": "本则新闻的分类栏目是:综合新闻"
}
```
```json
// news generation
{
"id": 61402,
"category": "news_generate",
"instruction": "帮我写一则综合新闻栏目的新闻,标题是机械与动力工程学院举办心理健康月系列活动之“与压力一路同行”",
"input": "",
"output": "《机械与动力工程学院举办心理健康月系列活动之“与压力一路同行”》:\n\n5月27日,机械与动力工程学院心理健康月系列活动之“与压力一路同行”讲座在高田会堂举行。活动特邀海城铁路学院兼职心理咨询师李毅作为主讲嘉宾,与大家分享如何正确认识压力以及积极应对压力 。本次活动由机械与动力工程学院王睿瑞主持。相较于受到心理教育较为全面的本科生同学,研究生同学面临更多的现实问题,更加需要心理健康的知识,来实现压力与情绪的合理管理。李毅首先结合心理学知识,从生理学、心理学等多个角度带领同学们认识了压力的本质,并指出并非所有压力都是有害的,负性的慢性压力对人产生危害,合理规避有害的压力尤为重要,关键是找到自己的最佳压力水平。李毅进一步讲述了心理压力与认知、情绪、行为之间的关系。通过对“情绪的身体地图”的说明,使同学们更加深刻地体会到了“情绪丰富了体验,情绪述说着需求,情绪激发出行为,情绪联结了他人”的含义。最后,李毅从时间管理、简化生活、认知需求、活在当下、友善待人等方面给出了应对压力的建议,并寄语大家“与并不完美的自我共处,与压力一路同行”。机械与动力工程学院心理健康月系列活动已举办4年,覆盖2700余名研究生,力求以多样的活动形式帮助学生和导师了解更多的心理知识,引导大家关爱自己、关心他人。"
}
```
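As a hedged sketch of the trimming suggested below (the `trim_by_length` helper is illustrative; the 2270-character threshold is the reported 95th-percentile length), records can be filtered by `instruction` + `output` length:

```python
def trim_by_length(records, max_len=2270):
    # Keep records whose instruction + output length is at most max_len
    # (2270 is the 95th percentile of the reported length distribution).
    return [
        r for r in records
        if len(r["instruction"]) + len(r["output"]) <= max_len
    ]

sample = [
    {"instruction": "a" * 2000, "output": "b" * 500},  # 2500 chars -> dropped
    {"instruction": "a" * 1000, "output": "b" * 200},  # 1200 chars -> kept
]
kept = trim_by_length(sample)
```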
Quantiles of the data (length of `instruction` + `output`) are also given here; trim as needed:
```
count 196101.000000
mean 1059.199912
std 571.623822
min 32.000000
0% 32.000000
5% 369.000000
10% 471.000000
15% 539.000000
20% 596.000000
25% 649.000000
30% 700.000000
35% 753.000000
40% 806.000000
45% 862.000000
50% 920.000000
55% 983.000000
60% 1053.000000
65% 1132.000000
70% 1225.000000
75% 1337.000000
80% 1473.000000
85% 1652.000000
90% 1898.000000
95% 2270.000000
max 3054.000000
``` |
open-llm-leaderboard/details_psmathur__model_007_13b | ---
pretty_name: Evaluation run of psmathur/model_007_13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [psmathur/model_007_13b](https://huggingface.co/psmathur/model_007_13b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__model_007_13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-11T11:34:56.294632](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_007_13b/blob/main/results_2023-08-11T11%3A34%3A56.294632.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2314240573187148,\n\
\ \"acc_stderr\": 0.03071122006512167,\n \"acc_norm\": 0.2314240573187148,\n\
\ \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \
\ \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n\
\ },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n\
\ \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n\
\ \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\"\
: {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n\
\ \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n\
\ },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n\
\ \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n\
\ \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n\
\ \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n\
\ \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n\
\ \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n\
\ \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n\
\ \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n\
\ \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"\
acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n\
\ \"mc2_stderr\": NaN\n }\n}\n```"
repo_url: https://huggingface.co/psmathur/model_007_13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|arc:challenge|25_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|arc:challenge|25_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hellaswag|10_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hellaswag|10_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T13:37:17.110700.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-11T11:34:56.294632.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-11T11:34:56.294632.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T13:37:17.110700.parquet'
- split: 2023_08_11T11_34_56.294632
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-11T11:34:56.294632.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-11T11:34:56.294632.parquet'
- config_name: results
data_files:
- split: 2023_08_09T13_37_17.110700
path:
- results_2023-08-09T13:37:17.110700.parquet
- split: 2023_08_11T11_34_56.294632
path:
- results_2023-08-11T11:34:56.294632.parquet
- split: latest
path:
- results_2023-08-11T11:34:56.294632.parquet
---
# Dataset Card for Evaluation run of psmathur/model_007_13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/model_007_13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/model_007_13b](https://huggingface.co/psmathur/model_007_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__model_007_13b",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-11T11:34:56.294632](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_007_13b/blob/main/results_2023-08-11T11%3A34%3A56.294632.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2314240573187148,
"acc_stderr": 0.03071122006512167,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
}
}
```
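As a small sketch (not part of the original card), the per-task accuracies in the results JSON above can be aggregated with plain Python; the two values below are copied from the report for illustration, and the dictionary shape mirrors the `"harness|..."` entries:

```python
# Per-task entries as they appear in the results JSON above (two shown).
results = {
    "harness|arc:challenge|25": {"acc": 0.22696245733788395},
    "harness|hellaswag|10": {"acc": 0.2504481179047998},
}

# Mean accuracy across the selected tasks.
mean_acc = sum(v["acc"] for v in results.values()) / len(results)
print(round(mean_acc, 4))  # → 0.2387
```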
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
heliosprime/twitter_dataset_1713020788 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 14564
num_examples: 33
download_size: 10123
dataset_size: 14564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713020788"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JWBickel/Concordances_And_Cross_References | ---
language:
- en
---
The phrases in Phrasal Concordance are chunked by Strong's numbers. The Cross References have THREE delimiter levels: caret (columns), comma (references), and percent (votes).
BLB (Blue Letter Bible) Greek
-
GroupID ^ ReferenceID ^ Reference
BLB Hebrew
-
GroupID ^ ReferenceID ^ Reference
Chained Phrasal Concordances
-
Reference ^ StrongsChunkedPhraseChain
Cross References
-
RowID ^ ReferencedPassage ^ ReferencingPassagesWithVoteCounts
Phrasal Concordance
-
Phrase ^ Reference ^ Count |
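A minimal parsing sketch for the three delimiter levels of a Cross References row, assuming the caret/comma/percent layout described above; the passages and vote counts in the sample row are invented for illustration:

```python
def parse_cross_reference(line):
    """Split one Cross References row on its three delimiter levels:
    caret between columns, comma between references, percent before votes."""
    row_id, referenced, referencing = [p.strip() for p in line.split("^")]
    refs = []
    for entry in referencing.split(","):
        passage, votes = entry.split("%")
        refs.append((passage.strip(), int(votes)))
    return row_id, referenced, refs

# Hypothetical row for illustration only.
print(parse_cross_reference("1 ^ Gen.1.1 ^ John.1.1%5,Heb.11.3%3"))
```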
dhiruHF/DocQA-demo-dataset | ---
dataset_info:
features:
- name: input
dtype: string
splits:
- name: train
num_bytes: 65935
num_examples: 20
download_size: 30041
dataset_size: 65935
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "DocQA-demo-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/kuraue_hinata_encouragementofclimb | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kuraue Hinata
This is the dataset of Kuraue Hinata, containing 299 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 299 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 722 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 876 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 299 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 299 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 299 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 722 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 722 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 625 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 876 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 876 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
HydraLM/physics_dataset_list_dict | ---
dataset_info:
features:
- name: conversations
list:
- name: input
dtype: string
- name: response
dtype: string
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 49797238
num_examples: 19999
download_size: 23668055
dataset_size: 49797238
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "physics_dataset_list_dict"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
suolyer/cmqa | ---
license: apache-2.0
---
|
Ti-Ma/wikipedia_2011 | ---
license: cc-by-sa-3.0
---
|
princeton-nlp/SWE-bench_bm25_13k_cl100k | ---
dataset_info:
features:
- name: base_commit
dtype: string
- name: hints_text
dtype: string
- name: created_at
dtype: string
- name: test_patch
dtype: string
- name: repo
dtype: string
- name: problem_statement
dtype: string
- name: version
dtype: string
- name: instance_id
dtype: string
- name: FAIL_TO_PASS
dtype: string
- name: PASS_TO_PASS
dtype: string
- name: environment_setup_commit
dtype: string
- name: text
dtype: string
- name: input_ids
sequence: int32
- name: labels
sequence: int64
- name: patch
dtype: string
splits:
- name: test
num_bytes: 278496488
num_examples: 2294
download_size: 114205622
dataset_size: 278496488
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
### Dataset Summary
SWE-bench is a dataset that tests systems’ ability to solve GitHub issues automatically. The dataset collects 2,294 Issue-Pull Request pairs from 12 popular Python repositories. Evaluation is performed by unit test verification using post-PR behavior as the reference solution.
### Supported Tasks and Leaderboards
SWE-bench proposes a new task: issue resolution, given a full repository and a GitHub issue. The leaderboard can be found at www.swebench.com.
### Languages
The text of the dataset is primarily English, but we make no effort to filter or otherwise clean based on language type.
## Dataset Structure
### Data Instances
An example of a SWE-bench datum is as follows:
```
instance_id: (str) - A formatted instance identifier, usually as repo_owner__repo_name-PR-number.
patch: (str) - The gold patch, the patch generated by the PR (minus test-related code), that resolved the issue.
repo: (str) - The repository owner/name identifier from GitHub.
base_commit: (str) - The commit hash of the repository representing the HEAD of the repository before the solution PR is applied.
hints_text: (str) - Comments made on the issue prior to the creation of the solution PR’s first commit creation date.
created_at: (str) - The creation date of the pull request.
test_patch: (str) - A test-file patch that was contributed by the solution PR.
problem_statement: (str) - The issue title and body.
version: (str) - Installation version to use for running evaluation.
environment_setup_commit: (str) - commit hash to use for environment setup and installation.
FAIL_TO_PASS: (str) - A json list of strings that represent the set of tests resolved by the PR and tied to the issue resolution.
PASS_TO_PASS: (str) - A json list of strings that represent tests that should pass before and after the PR application.
text: (str) - The generated text according to the retrieval criterion and the style-2 prompt found in [github:SWE-bench](https://github.com/princeton-nlp/SWE-bench).
input_ids: (List[int]) - The cl100k_base tokens for each text.
```
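Since `FAIL_TO_PASS` and `PASS_TO_PASS` are stored as JSON-encoded lists of strings (per the field descriptions above), they need to be decoded before use; the value below is a hypothetical example, not taken from the dataset:

```python
import json

# FAIL_TO_PASS / PASS_TO_PASS fields arrive as JSON strings; decode them
# into Python lists of test names. This sample value is invented.
fail_to_pass = '["tests/test_example.py::test_fix_applied"]'
tests = json.loads(fail_to_pass)
print(len(tests), tests[0])
```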
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_LordNoah__Alpaca_refine_gpt2_e0_se1 | ---
pretty_name: Evaluation run of LordNoah/Alpaca_refine_gpt2_e0_se1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LordNoah/Alpaca_refine_gpt2_e0_se1](https://huggingface.co/LordNoah/Alpaca_refine_gpt2_e0_se1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LordNoah__Alpaca_refine_gpt2_e0_se1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-22T16:13:20.086955](https://huggingface.co/datasets/open-llm-leaderboard/details_LordNoah__Alpaca_refine_gpt2_e0_se1/blob/main/results_2024-01-22T16-13-20.086955.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2707744494888737,\n\
\ \"acc_stderr\": 0.031386153673103406,\n \"acc_norm\": 0.272619145262433,\n\
\ \"acc_norm_stderr\": 0.03218390044538922,\n \"mc1\": 0.21664626682986537,\n\
\ \"mc1_stderr\": 0.014421468452506978,\n \"mc2\": 0.37888278063696673,\n\
\ \"mc2_stderr\": 0.014137600334109192\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2645051194539249,\n \"acc_stderr\": 0.012889272949313366,\n\
\ \"acc_norm\": 0.29180887372013653,\n \"acc_norm_stderr\": 0.013284525292403508\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.36367257518422624,\n\
\ \"acc_stderr\": 0.004800728138792374,\n \"acc_norm\": 0.4534953196574388,\n\
\ \"acc_norm_stderr\": 0.004968151878211051\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.037125378336148665,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.037125378336148665\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.037827289808654685,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.037827289808654685\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3283018867924528,\n \"acc_stderr\": 0.028901593612411784,\n\
\ \"acc_norm\": 0.3283018867924528,\n \"acc_norm_stderr\": 0.028901593612411784\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774707,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774707\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083287,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083287\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.30638297872340425,\n \"acc_stderr\": 0.030135906478517563,\n\
\ \"acc_norm\": 0.30638297872340425,\n \"acc_norm_stderr\": 0.030135906478517563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518753,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518753\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438014,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438014\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633356,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633356\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.038522733649243156,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.038522733649243156\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2645161290322581,\n \"acc_stderr\": 0.02509189237885928,\n \"\
acc_norm\": 0.2645161290322581,\n \"acc_norm_stderr\": 0.02509189237885928\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.32019704433497537,\n \"acc_stderr\": 0.0328264938530415,\n \"\
acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.0328264938530415\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.035243908445117836,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.035243908445117836\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"\
acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.33678756476683935,\n \"acc_stderr\": 0.034107802518361825,\n\
\ \"acc_norm\": 0.33678756476683935,\n \"acc_norm_stderr\": 0.034107802518361825\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.358974358974359,\n \"acc_stderr\": 0.024321738484602357,\n \
\ \"acc_norm\": 0.358974358974359,\n \"acc_norm_stderr\": 0.024321738484602357\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844065,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844065\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.026653531596715477,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.026653531596715477\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3376146788990826,\n\
\ \"acc_stderr\": 0.020275265986638903,\n \"acc_norm\": 0.3376146788990826,\n\
\ \"acc_norm_stderr\": 0.020275265986638903\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.2916666666666667,\n \"acc_stderr\": 0.03099866630456053,\n\
\ \"acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.03099866630456053\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693264,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693264\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598018,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598018\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.10762331838565023,\n\
\ \"acc_stderr\": 0.020799400082879997,\n \"acc_norm\": 0.10762331838565023,\n\
\ \"acc_norm_stderr\": 0.020799400082879997\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.33884297520661155,\n \"acc_stderr\": 0.0432076780753667,\n \"\
acc_norm\": 0.33884297520661155,\n \"acc_norm_stderr\": 0.0432076780753667\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.04007341809755805,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.04007341809755805\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20051085568326948,\n\
\ \"acc_stderr\": 0.014317653708594207,\n \"acc_norm\": 0.20051085568326948,\n\
\ \"acc_norm_stderr\": 0.014317653708594207\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2774566473988439,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.2774566473988439,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.02463004897982478,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.02463004897982478\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.29260450160771706,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.023468429832451163,\n\
\ \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.023468429832451163\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.02646903681859063,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.02646903681859063\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23402868318122555,\n\
\ \"acc_stderr\": 0.010813585552659674,\n \"acc_norm\": 0.23402868318122555,\n\
\ \"acc_norm_stderr\": 0.010813585552659674\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2426470588235294,\n \"acc_stderr\": 0.02604066247420126,\n\
\ \"acc_norm\": 0.2426470588235294,\n \"acc_norm_stderr\": 0.02604066247420126\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24509803921568626,\n \"acc_stderr\": 0.01740181671142765,\n \
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.01740181671142765\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2571428571428571,\n \"acc_stderr\": 0.027979823538744546,\n\
\ \"acc_norm\": 0.2571428571428571,\n \"acc_norm_stderr\": 0.027979823538744546\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21890547263681592,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.21890547263681592,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n\
\ \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n\
\ \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n\
\ \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21664626682986537,\n\
\ \"mc1_stderr\": 0.014421468452506978,\n \"mc2\": 0.37888278063696673,\n\
\ \"mc2_stderr\": 0.014137600334109192\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5430149960536701,\n \"acc_stderr\": 0.01400038676159829\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006823351023502654,\n \
\ \"acc_stderr\": 0.0022675371022545087\n }\n}\n```"
repo_url: https://huggingface.co/LordNoah/Alpaca_refine_gpt2_e0_se1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|arc:challenge|25_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|gsm8k|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hellaswag|10_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T16-13-20.086955.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T16-13-20.086955.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- '**/details_harness|winogrande|5_2024-01-22T16-13-20.086955.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-22T16-13-20.086955.parquet'
- config_name: results
data_files:
- split: 2024_01_22T16_13_20.086955
path:
- results_2024-01-22T16-13-20.086955.parquet
- split: latest
path:
- results_2024-01-22T16-13-20.086955.parquet
---
# Dataset Card for Evaluation run of LordNoah/Alpaca_refine_gpt2_e0_se1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LordNoah/Alpaca_refine_gpt2_e0_se1](https://huggingface.co/LordNoah/Alpaca_refine_gpt2_e0_se1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LordNoah__Alpaca_refine_gpt2_e0_se1",
"harness_winogrande_5",
split="latest")
```
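The configuration names in the YAML header above follow a regular pattern, `harness_<task>_<n_shot>`, with `-` and `:` in the harness task name replaced by `_`. A small sketch of that mapping (a hypothetical helper of my own, not part of the leaderboard tooling):

```python
def harness_config_name(task: str, n_shot: int) -> str:
    """Map a harness task name to a config name as listed in this card,
    e.g. ("hendrycksTest-moral_scenarios", 5) -> "harness_hendrycksTest_moral_scenarios_5"."""
    normalized = task.replace("-", "_").replace(":", "_")
    return f"harness_{normalized}_{n_shot}"

# A few names taken from this card's header:
print(harness_config_name("hendrycksTest-moral_scenarios", 5))  # harness_hendrycksTest_moral_scenarios_5
print(harness_config_name("truthfulqa:mc", 0))                  # harness_truthfulqa_mc_0
```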
## Latest results
These are the [latest results from run 2024-01-22T16:13:20.086955](https://huggingface.co/datasets/open-llm-leaderboard/details_LordNoah__Alpaca_refine_gpt2_e0_se1/blob/main/results_2024-01-22T16-13-20.086955.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2707744494888737,
"acc_stderr": 0.031386153673103406,
"acc_norm": 0.272619145262433,
"acc_norm_stderr": 0.03218390044538922,
"mc1": 0.21664626682986537,
"mc1_stderr": 0.014421468452506978,
"mc2": 0.37888278063696673,
"mc2_stderr": 0.014137600334109192
},
"harness|arc:challenge|25": {
"acc": 0.2645051194539249,
"acc_stderr": 0.012889272949313366,
"acc_norm": 0.29180887372013653,
"acc_norm_stderr": 0.013284525292403508
},
"harness|hellaswag|10": {
"acc": 0.36367257518422624,
"acc_stderr": 0.004800728138792374,
"acc_norm": 0.4534953196574388,
"acc_norm_stderr": 0.004968151878211051
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.037125378336148665,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.037125378336148665
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.037827289808654685,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.037827289808654685
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3283018867924528,
"acc_stderr": 0.028901593612411784,
"acc_norm": 0.3283018867924528,
"acc_norm_stderr": 0.028901593612411784
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774707,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774707
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083287,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083287
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.30638297872340425,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.30638297872340425,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518753,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518753
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438014,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438014
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633356,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633356
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243156,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.32019704433497537,
"acc_stderr": 0.0328264938530415,
"acc_norm": 0.32019704433497537,
"acc_norm_stderr": 0.0328264938530415
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.035243908445117836,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.035243908445117836
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.33678756476683935,
"acc_stderr": 0.034107802518361825,
"acc_norm": 0.33678756476683935,
"acc_norm_stderr": 0.034107802518361825
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.358974358974359,
"acc_stderr": 0.024321738484602357,
"acc_norm": 0.358974358974359,
"acc_norm_stderr": 0.024321738484602357
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844065,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844065
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.026653531596715477,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.026653531596715477
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3376146788990826,
"acc_stderr": 0.020275265986638903,
"acc_norm": 0.3376146788990826,
"acc_norm_stderr": 0.020275265986638903
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03099866630456053,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03099866630456053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598018,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598018
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.10762331838565023,
"acc_stderr": 0.020799400082879997,
"acc_norm": 0.10762331838565023,
"acc_norm_stderr": 0.020799400082879997
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.33884297520661155,
"acc_stderr": 0.0432076780753667,
"acc_norm": 0.33884297520661155,
"acc_norm_stderr": 0.0432076780753667
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755805,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755805
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20051085568326948,
"acc_stderr": 0.014317653708594207,
"acc_norm": 0.20051085568326948,
"acc_norm_stderr": 0.014317653708594207
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.29260450160771706,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.29260450160771706,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.023468429832451163,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.023468429832451163
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.02646903681859063,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.02646903681859063
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23402868318122555,
"acc_stderr": 0.010813585552659674,
"acc_norm": 0.23402868318122555,
"acc_norm_stderr": 0.010813585552659674
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2426470588235294,
"acc_stderr": 0.02604066247420126,
"acc_norm": 0.2426470588235294,
"acc_norm_stderr": 0.02604066247420126
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.01740181671142765,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.01740181671142765
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2571428571428571,
"acc_stderr": 0.027979823538744546,
"acc_norm": 0.2571428571428571,
"acc_norm_stderr": 0.027979823538744546
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21890547263681592,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.21890547263681592,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21686746987951808,
"acc_stderr": 0.03208284450356365,
"acc_norm": 0.21686746987951808,
"acc_norm_stderr": 0.03208284450356365
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21664626682986537,
"mc1_stderr": 0.014421468452506978,
"mc2": 0.37888278063696673,
"mc2_stderr": 0.014137600334109192
},
"harness|winogrande|5": {
"acc": 0.5430149960536701,
"acc_stderr": 0.01400038676159829
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.0022675371022545087
}
}
```
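Since the aggregated results are plain JSON, they can be post-processed with ordinary Python. For instance, a minimal sketch that picks out per-task accuracy (the dict below copies a handful of entries from the results above; keys follow the `harness|<task>|<n_shot>` naming used throughout this card):

```python
# A few entries copied from the aggregated results above.
results = {
    "all": {"acc": 0.2707744494888737},
    "harness|arc:challenge|25": {"acc": 0.2645051194539249},
    "harness|hellaswag|10": {"acc": 0.36367257518422624},
    "harness|winogrande|5": {"acc": 0.5430149960536701},
    "harness|gsm8k|5": {"acc": 0.006823351023502654},
}

# Per-task accuracy, skipping the "all" aggregate entry.
per_task_acc = {task: v["acc"] for task, v in results.items() if task != "all"}

# Task with the highest accuracy among those copied here.
best_task = max(per_task_acc, key=per_task_acc.get)
print(best_task)  # harness|winogrande|5
```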
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
LVN/photo | ---
license: openrail
---
|
open-llm-leaderboard/details_jerryjalapeno__nart-100k-7b | ---
pretty_name: Evaluation run of jerryjalapeno/nart-100k-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jerryjalapeno/nart-100k-7b](https://huggingface.co/jerryjalapeno/nart-100k-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jerryjalapeno__nart-100k-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T00:09:38.629020](https://huggingface.co/datasets/open-llm-leaderboard/details_jerryjalapeno__nart-100k-7b/blob/main/results_2023-09-23T00-09-38.629020.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.09867869127516779,\n\
\ \"em_stderr\": 0.003054155613095924,\n \"f1\": 0.1502359479865761,\n\
\ \"f1_stderr\": 0.0031707924833711204,\n \"acc\": 0.3702237889195194,\n\
\ \"acc_stderr\": 0.008962759297749477\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.09867869127516779,\n \"em_stderr\": 0.003054155613095924,\n\
\ \"f1\": 0.1502359479865761,\n \"f1_stderr\": 0.0031707924833711204\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0356330553449583,\n \
\ \"acc_stderr\": 0.00510610785374419\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.7048145224940805,\n \"acc_stderr\": 0.012819410741754763\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jerryjalapeno/nart-100k-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|arc:challenge|25_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T00_09_38.629020
path:
- '**/details_harness|drop|3_2023-09-23T00-09-38.629020.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T00-09-38.629020.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T00_09_38.629020
path:
- '**/details_harness|gsm8k|5_2023-09-23T00-09-38.629020.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T00-09-38.629020.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hellaswag|10_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:14:45.628566.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T11:14:45.628566.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T11:14:45.628566.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T00_09_38.629020
path:
- '**/details_harness|winogrande|5_2023-09-23T00-09-38.629020.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T00-09-38.629020.parquet'
- config_name: results
data_files:
- split: 2023_07_24T11_14_45.628566
path:
- results_2023-07-24T11:14:45.628566.parquet
- split: 2023_09_23T00_09_38.629020
path:
- results_2023-09-23T00-09-38.629020.parquet
- split: latest
path:
- results_2023-09-23T00-09-38.629020.parquet
---
# Dataset Card for Evaluation run of jerryjalapeno/nart-100k-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jerryjalapeno/nart-100k-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jerryjalapeno/nart-100k-7b](https://huggingface.co/jerryjalapeno/nart-100k-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jerryjalapeno__nart-100k-7b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-23T00:09:38.629020](https://huggingface.co/datasets/open-llm-leaderboard/details_jerryjalapeno__nart-100k-7b/blob/main/results_2023-09-23T00-09-38.629020.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.09867869127516779,
"em_stderr": 0.003054155613095924,
"f1": 0.1502359479865761,
"f1_stderr": 0.0031707924833711204,
"acc": 0.3702237889195194,
"acc_stderr": 0.008962759297749477
},
"harness|drop|3": {
"em": 0.09867869127516779,
"em_stderr": 0.003054155613095924,
"f1": 0.1502359479865761,
"f1_stderr": 0.0031707924833711204
},
"harness|gsm8k|5": {
"acc": 0.0356330553449583,
"acc_stderr": 0.00510610785374419
},
"harness|winogrande|5": {
"acc": 0.7048145224940805,
"acc_stderr": 0.012819410741754763
}
}
```
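Once parsed, these results are plain nested dictionaries keyed as `harness|<task>|<n-shot>`. A minimal sketch of extracting the per-task accuracies (the values below are copied from the excerpt above; the aggregate "all" entry is omitted, and only this excerpt's key layout is assumed):

```python
import json

# Excerpt of the aggregated results shown above, as emitted by the harness.
# Tasks without an "acc" metric (e.g. drop, which reports em/f1) are skipped.
results = json.loads("""
{
  "harness|gsm8k|5": {"acc": 0.0356330553449583, "acc_stderr": 0.00510610785374419},
  "harness|winogrande|5": {"acc": 0.7048145224940805, "acc_stderr": 0.012819410741754763}
}
""")

# Collect accuracy per task, formatted as "task (n-shot)".
accs = {}
for key, metrics in results.items():
    _, task, shots = key.split("|")
    if "acc" in metrics:
        accs[f"{task} ({shots}-shot)"] = metrics["acc"]

best = max(accs, key=accs.get)
print(best)  # winogrande (5-shot)
```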
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
lmqg/qa_harvesting_from_wikipedia_pseudo | ---
license: cc-by-4.0
pretty_name: Synthetic QA dataset.
language: en
multilinguality: monolingual
size_categories: 10K<n<100K
source_datasets:
- extended|wikipedia
task_categories:
- question-answering
task_ids:
- extractive-qa
---
# Dataset Card for "lmqg/qa_harvesting_from_wikipedia_pseudo"
## Dataset Description
- **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
- **Paper:** [https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)
- **Point of Contact:** [Asahi Ushio](http://asahiushio.com/)
### Dataset Summary
This is a synthetic QA dataset generated with fine-tuned QG models over [`lmqg/qa_harvesting_from_wikipedia`](https://huggingface.co/datasets/lmqg/qa_harvesting_from_wikipedia), 1 million paragraph-and-answer pairs collected in [Du and Cardie, 2018](https://aclanthology.org/P18-1177/). It was made for the question-answering-based evaluation (QAE) of question generation models proposed by [Zhang and Bansal, 2019](https://aclanthology.org/D19-1253/).
The `train` split is the synthetic data and the `validation` split is the original validation set of [SQuAD](https://rajpurkar.github.io/SQuAD-explorer/), on which the model should be evaluated.
This contains synthetic QA datasets created with the following QG models:
- [lmqg/bart-base-squad](https://huggingface.co/lmqg/bart-base-squad)
- [lmqg/bart-large-squad](https://huggingface.co/lmqg/bart-large-squad)
- [lmqg/t5-small-squad](https://huggingface.co/lmqg/t5-small-squad)
- [lmqg/t5-base-squad](https://huggingface.co/lmqg/t5-base-squad)
- [lmqg/t5-large-squad](https://huggingface.co/lmqg/t5-large-squad)
See more details about QAE at [https://github.com/asahi417/lm-question-generation/tree/master/misc/qa_based_evaluation](https://github.com/asahi417/lm-question-generation/tree/master/misc/emnlp_2022/qa_based_evaluation).
### Supported Tasks and Leaderboards
* `question-answering`
### Languages
English (en)
## Dataset Structure
### Data Fields
The data fields are the same among all splits.
#### plain_text
- `id`: a `string` feature of id
- `title`: a `string` feature of title of the paragraph
- `context`: a `string` feature of paragraph
- `question`: a `string` feature of question
- `answers`: a `json` feature of answers
### Data Splits
|train |validation|
|--------:|---------:|
|1,092,142| 10,570 |
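Since the `answers` field is a json feature in the usual extractive-QA layout, each answer span can be checked against its context directly. A minimal sketch, assuming SQuAD-style `text`/`answer_start` keys (an assumption; this card does not spell out the inner key names), on a hypothetical record:

```python
# Hypothetical record in the SQuAD-style layout; the key names
# "text" and "answer_start" are an assumption, not stated in this card.
example = {
    "context": "Question generation models are fine-tuned on SQuAD.",
    "question": "What are the models fine-tuned on?",
    "answers": {"text": ["SQuAD"], "answer_start": [45]},
}

def answer_is_consistent(ex):
    """Verify each answer string actually occurs at its recorded offset."""
    return all(
        ex["context"][start : start + len(text)] == text
        for text, start in zip(ex["answers"]["text"], ex["answers"]["answer_start"])
    )

print(answer_is_consistent(example))  # True
```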
## Citation Information
```
@inproceedings{ushio-etal-2022-generative,
title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
author = "Ushio, Asahi and
Alva-Manchego, Fernando and
Camacho-Collados, Jose",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, U.A.E.",
publisher = "Association for Computational Linguistics",
}
``` |
zhensuuu/starcoderdata_100star_py | ---
dataset_info:
features:
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_count
dtype: int64
- name: id
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 3509121581
num_examples: 722003
download_size: 1297213342
dataset_size: 3509121581
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "starcoderdata_100star_py"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alisson40889/dubladore | ---
license: openrail
---
|
loubnabnl/prs-v2-sample | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: pull_request.guid
dtype: string
- name: pull_request.code_review_events
dtype: string
- name: pull_request.events
dtype: string
- name: pull_request.issue_events
dtype: string
- name: bucket
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 201909231
num_examples: 10000
download_size: 38860265
dataset_size: 201909231
---
# Dataset Card for "prs-v2-sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chenghao/quora_questions | ---
license: other
dataset_info:
features:
- name: questions
dtype: string
splits:
- name: train
num_bytes: 51635953
num_examples: 808580
download_size: 31079310
dataset_size: 51635953
annotations_creators:
- expert-generated
language:
- en
language_creators:
- found
multilinguality:
- monolingual
pretty_name: Quora Questions
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- semantic-similarity-classification
paperswithcode_id: null
---
# Dataset Card for "quora"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://www.kaggle.com/c/quora-question-pairs](https://www.kaggle.com/c/quora-question-pairs)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 55.48 MB
- **Size of the generated dataset:** 55.46 MB
- **Total amount of disk used:** 110.94 MB
### Dataset Summary
The Quora dataset is composed of question pairs, and the task is to determine if the questions are paraphrases of each other (have the same meaning).
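For intuition about the paraphrase task, a crude token-overlap (Jaccard) baseline is easy to sketch. This is purely illustrative and not a method used by this dataset or any leaderboard model:

```python
import re

def jaccard(q1: str, q2: str) -> float:
    """Token-level Jaccard overlap between two questions (case-insensitive)."""
    a = set(re.findall(r"[a-z0-9]+", q1.lower()))
    b = set(re.findall(r"[a-z0-9]+", q2.lower()))
    if not (a | b):  # guard against two empty questions
        return 1.0
    return len(a & b) / len(a | b)

# A pair sharing most of its vocabulary scores high...
print(round(jaccard("How do I learn Python?", "How do I learn Python fast?"), 2))  # 0.83
# ...while unrelated questions score low.
print(jaccard("How do I learn Python?", "What is the capital of France?"))  # 0.0
```

Real paraphrase pairs can share few tokens (and vice versa), which is exactly why this dataset is used to train semantic-similarity classifiers rather than lexical heuristics.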
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 55.48 MB
- **Size of the generated dataset:** 55.46 MB
- **Total amount of disk used:** 110.94 MB
### Data Fields
The data fields are the same among all splits.
### Data Splits
| name |train |
|-------|-----:|
|default|404290|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[Quora Term of Service](https://www.quora.com/about/tos), no commercial use.
### Citation Information
Unknown.
|
project-sloth/captcha-images | ---
dataset_info:
features:
- name: image
dtype: image
- name: solution
dtype: string
splits:
- name: train
num_bytes: 24564698
num_examples: 6000
- name: validation
num_bytes: 8195367
num_examples: 2000
- name: test
num_bytes: 8186295
num_examples: 2000
download_size: 28857965
dataset_size: 40946360
license: wtfpl
task_categories:
- image-to-text
tags:
- captcha
- ocr
size_categories:
- 1K<n<10K
---
# Captcha dataset
## Data
Captcha images whose solutions are exactly six-digit numbers
## Splits
* Train: 6000 images
* Validation: 2000 images
* Test: 2000 images
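Since every solution is a six-digit string, a simple format check (a sketch for downstream OCR evaluation, not part of the dataset tooling) can catch malformed predictions early:

```python
import re

def is_valid_solution(solution: str) -> bool:
    """Return True if a predicted captcha solution matches the dataset's
    format: exactly six digits, nothing else."""
    return re.fullmatch(r"[0-9]{6}", solution) is not None
```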
## Example
 |
Blutkoete/docki-ap1 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 123464.0
num_examples: 2806
- name: test
num_bytes: 30888.0
num_examples: 702
download_size: 80525
dataset_size: 154352.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
LightFury9/CulturaX_500ksamples_8splits_te | ---
dataset_info:
features:
- name: text
dtype: string
- name: timestamp
dtype: string
- name: url
dtype: string
- name: source
dtype: string
splits:
- name: split1
num_bytes: 470085184.10578954
num_examples: 62500
- name: split2
num_bytes: 470085184.10578954
num_examples: 62500
- name: split3
num_bytes: 470085184.10578954
num_examples: 62500
- name: split4
num_bytes: 470085184.10578954
num_examples: 62500
- name: split5
num_bytes: 470085184.10578954
num_examples: 62500
- name: split6
num_bytes: 470085184.10578954
num_examples: 62500
- name: split7
num_bytes: 470085184.10578954
num_examples: 62500
- name: split8
num_bytes: 470085184.10578954
num_examples: 62500
download_size: 1474381710
dataset_size: 3760681472.846317
configs:
- config_name: default
data_files:
- split: split1
path: data/split1-*
- split: split2
path: data/split2-*
- split: split3
path: data/split3-*
- split: split4
path: data/split4-*
- split: split5
path: data/split5-*
- split: split6
path: data/split6-*
- split: split7
path: data/split7-*
- split: split8
path: data/split8-*
---
|
techandy42/ppo-200K-collected-dataset-steps-500 | ---
dataset_info:
features:
- name: observation
sequence:
sequence:
sequence: float32
- name: action
sequence: int64
- name: reward
sequence: float32
- name: done
sequence: bool
splits:
- name: train
num_bytes: 176769250
num_examples: 1162000
download_size: 30970817
dataset_size: 176769250
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
wanicca/WikiHowQA-mnbvc | ---
license: mit
task_categories:
- question-answering
language:
- en
- zh
size_categories:
- 10K<n<100K
---
Chinese/English question-answering data extracted from WikiHow pages.
Related project: [MNBVC](https://github.com/esbatmop/MNBVC)
Extraction tool code: [WikiHowQAExtractor](https://github.com/wanicca/WikiHowQAExtractor) |
HuggingFaceH4/instruct_me | ---
license: apache-2.0
dataset_info:
- config_name: instruction_tuning
features:
- name: text
dtype: string
- name: meta
struct:
- name: source
dtype: string
- name: config
dtype: string
splits:
- name: train
num_bytes: 29975565
num_examples: 41685
- name: test
num_bytes: 3298059
num_examples: 4632
download_size: 18425612
dataset_size: 33273624
- config_name: reward_modelling
features:
- name: text
dtype: string
- name: meta
struct:
- name: source
dtype: string
- name: config
dtype: string
splits:
- name: train
num_bytes: 25274204
num_examples: 41685
- name: test
num_bytes: 2777314
num_examples: 4632
download_size: 15636566
dataset_size: 28051518
- config_name: ppo
features:
- name: prompt
dtype: string
- name: meta
struct:
- name: source
dtype: string
- name: config
dtype: string
splits:
- name: train
num_bytes: 50787070
num_examples: 83371
- name: test
num_bytes: 5715727
num_examples: 9264
download_size: 31461165
dataset_size: 56502797
- config_name: reward_modeling
features:
- name: prompt
dtype: string
- name: meta
struct:
- name: source
dtype: string
- name: config
dtype: string
splits:
- name: train
num_bytes: 25274204
num_examples: 41685
- name: test
num_bytes: 2777314
num_examples: 4632
download_size: 15636838
dataset_size: 28051518
task_categories:
- conversational
- text-generation
language:
- en
tags:
- human-feedback
- instruct
- reward-modeling
pretty_name: Instruct Me
---
# Dataset card for Instruct Me
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** Lewis Tunstall
### Dataset summary
Instruct Me is a dataset of prompts and instruction dialogues between a human user and an AI assistant. The prompts are derived from (prompt, completion) pairs in the [Helpful Instructions dataset](https://huggingface.co/datasets/HuggingFaceH4/helpful_instructions). The goal is to train a language model that is "chatty" and can answer the kinds of questions or tasks a human user might instruct an AI assistant to perform.
### Supported Tasks and Leaderboard
We provide 3 configs that can be used for training RLHF models:
#### instruction_tuning
Single-turn user/bot dialogues for instruction tuning.
#### reward_modeling
Prompts to generate model completions and collect human preference data
#### ppo
Prompts to generate model completions for optimization of the instruction-tuned model with techniques like PPO.
### Changelog
* March 6, 2023: `v1.1.0` release. Changed the `text` columns for the `reward_modeling` and `ppo` configs to `prompt` for consistency with our dataset schemas elsewhere.
* March 5, 2023: `v1.0.0` release. |
miraeconan/network-data | ---
license: cc0-1.0
---
|
paulinaaguilera/trabajo_final2 | ---
license: apache-2.0
---
|
Posos/MedNERF | ---
license: cc-by-nc-sa-4.0
task_categories:
- token-classification
language:
- fr
tags:
- medical
pretty_name: MedNERF
size_categories:
- n<1K
---
# MedNERF
## Dataset Description
- **Paper:** [Multilingual Clinical NER: Translation or Cross-lingual Transfer?](https://arxiv.org/abs/2306.04384)
- **Point of Contact:** [email](research@posos.fr)
### Dataset Summary
MedNERF is a French medical NER dataset whose aim is to serve as a test set for medical NER models.
It has been built using a sample of French medical prescriptions annotated with the same guidelines as the [n2c2 dataset](https://academic.oup.com/jamia/article-abstract/27/1/3/5581277?redirectedFrom=fulltext&login=false).
Entities are annotated with the following labels: `Drug`, `Strength`, `Form`, `Dosage`, `Duration` and `Frequency`, using the IOB format.
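The IOB tags can be grouped back into entity spans. The following sketch uses the label names listed above with a standard IOB decoding walk; it is illustrative code, not part of the MedNERF release, and the example prescription fragment is invented:

```python
def iob_to_spans(tokens, tags):
    """Group IOB tags into (entity_type, [tokens]) spans.
    Standard IOB decoding; an "O" tag or an inconsistent I- tag
    closes the current span."""
    spans, current = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            current = (tag[2:], [token])
            spans.append(current)
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            current[1].append(token)
        else:
            current = None
    return spans

# Hypothetical French prescription fragment (not taken from the dataset)
tokens = ["Doliprane", "500", "mg", ",", "3", "fois", "par", "jour"]
tags   = ["B-Drug", "B-Strength", "I-Strength", "O",
          "B-Frequency", "I-Frequency", "I-Frequency", "I-Frequency"]
spans = iob_to_spans(tokens, tags)
```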
## Licensing Information
This dataset is distributed under the Creative Commons Attribution Non Commercial Share Alike 4.0 license.
## Citation information
```
@inproceedings{mednerf,
title = "Multilingual Clinical NER: Translation or Cross-lingual Transfer?",
author = "Gaschi, Félix and Fontaine, Xavier and Rastin, Parisa and Toussaint, Yannick",
booktitle = "Proceedings of the 5th Clinical Natural Language Processing Workshop",
publisher = "Association for Computational Linguistics",
year = "2023"
}
``` |
mayerantoine/injury-narrative-coding | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
tags:
- medical
size_categories:
- 100K<n<1M
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:** https://github.com/NASA-Tournament-Lab/CDC-NLP-Occ-Injury-Coding
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Dataset used in the 2018 NASA Tournament Lab competition, organized in collaboration with the National Institute for Occupational Safety and Health (NIOSH). The goal of the competition was to classify unstructured free-text “injury narratives” recorded in surveillance systems into injury codes from the Occupational Injuries and Illnesses Classification System (OIICS).
### Supported Tasks and Leaderboards
This is a multi-class text classification task. The training dataset includes 48 classifiable event codes distributed across 7 categories:
* Violence and other injuries by persons and animals
* Transportation incidents
* Fires and explosions
* Falls, slips, and trips
* Exposure to harmful substances or environments
* Contact with objects and equipment
* Overexertion and bodily reaction
For example, the text 'DOING UNSPECIFIED LIFTING AT WORK AND DEVELOPED PAIN ACROSS CHEST CHEST PAIN' is coded as 71, which means 'Overexertion involving outside sources'.
More details on the categories and event codes can be found at: https://wwwn.cdc.gov/wisards/oiics/Trees/MultiTree.aspx?TreeType=Event
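In the OIICS event tree, the leading digit of an event code identifies its top-level division (the example code 71 above falls under overexertion). Assuming the digit-to-division ordering matches the category list above, recovering the category from a code can be sketched as:

```python
# Sketch: map an OIICS event code to its top-level division by its
# leading digit. The digit-to-division mapping below is an assumption
# based on the standard OIICS event tree, not shipped with the dataset.
OIICS_DIVISIONS = {
    "1": "Violence and other injuries by persons and animals",
    "2": "Transportation incidents",
    "3": "Fires and explosions",
    "4": "Falls, slips, and trips",
    "5": "Exposure to harmful substances or environments",
    "6": "Contact with objects and equipment",
    "7": "Overexertion and bodily reaction",
}

def division_of(event_code) -> str:
    """Return the top-level OIICS division for an event code like 71."""
    return OIICS_DIVISIONS[str(event_code)[0]]
```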
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
* text: injury narrative description
* sex: sex
* age: age
* event: OIICS event code, which is the classification label
### Data Splits
| name | train | test | full_dataset |
|------------|--------:|-------:|-------------:|
| plain_text | 153,956 | 75,864 | 229,820 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
philschmid/zephyr-7b-beta-eval-dolly-15k | ---
dataset_info:
features:
- name: input
dtype: string
- name: ref_completion
dtype: string
- name: generations
dtype: string
splits:
- name: train
num_bytes: 166673
num_examples: 100
download_size: 110903
dataset_size: 166673
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tamilmixsentiment | ---
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
language:
- en
- ta
license:
- unknown
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- sentiment-classification
pretty_name: Tamilmixsentiment
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': Positive
'1': Negative
'2': Mixed_feelings
'3': unknown_state
'4': not-Tamil
splits:
- name: train
num_bytes: 790132
num_examples: 11335
- name: validation
num_bytes: 89618
num_examples: 1260
- name: test
num_bytes: 218764
num_examples: 3149
download_size: 1150792
dataset_size: 1098514
---
# Dataset Card for Tamilmixsentiment
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Tamilmixsentiment Homepage](https://dravidian-codemix.github.io/2020/index.html)
- **Repository:** [Tamilmixsentiment repository](https://dravidian-codemix.github.io/2020/datasets.html)
- **Paper:** [Corpus Creation for Sentiment Analysis in Code-Mixed Tamil-English Text](https://www.aclweb.org/anthology/2020.sltu-1.28/)
- **Leaderboard:** [Rank list](https://drive.google.com/file/d/1Mf8-No-63koGRwdF13RrO01NAFBlNmI0/view?usp=sharing)
- **Point of Contact:** [Bharathi Raja Chakravarthi](mailto:bharathiraja.akr@gmail.com)
### Dataset Summary
The first gold-standard Tamil-English code-switched, sentiment-annotated corpus, containing 15,744 comment posts from YouTube. This makes it the largest general-domain sentiment dataset for this relatively low-resource language with the code-mixing phenomenon. A comment/post may contain more than one sentence, but the average number of sentences per comment in the corpus is 1. Each comment/post is annotated with sentiment polarity at the comment/post level. The dataset also has class-imbalance problems, reflecting real-world scenarios.
### Supported Tasks and Leaderboards
The task is to identify the sentiment polarity of code-mixed Tamil-English comments/posts collected from social media.
### Languages
Tamil-English code-switched. The dataset contains all the three types of code-mixed sentences - Inter-Sentential switch, Intra-Sentential switch and Tag switching. Most comments were written in Roman script with either Tamil grammar with English lexicon or English grammar with Tamil lexicon. Some comments were written in Tamil script with English expressions in between.
## Dataset Structure
### Data Instances
An example from the Tamilmixsentiment train set looks as follows:
```
text label
Trailer late ah parthavanga like podunga Positive
```
### Data Fields
- `text`: Tamil-English code-mixed comment.
- `label`: sentiment label, one of "Positive", "Negative", "Mixed_feelings", "unknown_state", "not-Tamil"
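In the released splits the label is stored as a class index; decoding it back to a sentiment name (using the index-to-name order declared in the dataset's `class_label` feature definition above) is a one-liner:

```python
# Label index -> sentiment name, as declared in the dataset's
# class_label feature definition.
LABEL_NAMES = ["Positive", "Negative", "Mixed_feelings",
               "unknown_state", "not-Tamil"]

def decode_label(idx: int) -> str:
    """Map a stored class index to its sentiment name."""
    return LABEL_NAMES[idx]
```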
### Data Splits
The entire dataset of 15,744 sentences was randomly shuffled and split into three parts as follows:
| | train | validation | test |
|------------------------------|------:|-----------:|-----:|
| Tamilmixsentiment | 11335 | 1260 | 3149 |
## Dataset Creation
### Curation Rationale
Sentiment analysis has become important in social media research (Yang and Eisenstein, 2017). Until recently these applications were created for high-resourced languages which analysed monolingual utterances. But social media in multilingual communities contains more code-mixed text. Code-mixing is common among speakers in a bilingual speech community. As English is seen as the language of prestige and education, the influence of lexicon, connectives and phrases from English language is common in spoken Tamil. Tamil has little annotated data for code-mixed scenarios. An annotated corpus developed for monolingual data cannot deal with code-mixed usage and therefore it fails to yield good results due to mixture of languages at different levels of linguistic analysis. Therefore this dataset of code-mixed Tamil-English sentiment annotated corpus is created.
### Source Data
#### Initial Data Collection and Normalization
The data was scraped from YouTube: in total, 184,573 Tamil-related sentences from comments on the trailers of movies released in 2019. Many of them were written entirely in English, entirely in Tamil, or in code-mixed Tamil-English, so a code-mixed corpus was filtered out based on comment-level language identification using the langdetect library. If a comment was written fully in Tamil or English, it was discarded, since monolingual resources are available for these languages. Comments identified as other languages such as Hindi, Malayalam, Urdu, Telugu, and Kannada were also removed. The comments were preprocessed by removing emoticons and applying a sentence-length filter. To create a code-mixed corpus of reasonable size with fairly well-defined sentiments, the filter removed sentences with fewer than five words or more than 15 words after cleaning. In the end, 15,744 Tanglish sentences remained.
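The length filter described above can be sketched as a simple word-count predicate (an illustration of the stated 5–15-word rule, not the authors' original script):

```python
def passes_length_filter(sentence: str,
                         min_words: int = 5,
                         max_words: int = 15) -> bool:
    """Keep sentences with 5 to 15 words after cleaning, as described
    in the corpus construction (illustrative reimplementation)."""
    n = len(sentence.split())
    return min_words <= n <= max_words
```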
#### Who are the source language producers?
Youtube users
### Annotations
#### Annotation process
The annotation setup had three steps. First, each sentence was annotated by two people. Second, a label was accepted if both of them agreed; in case of conflict, a third person annotated the sentence. Third, if all three still did not agree, two more annotators annotated the sentence.
#### Who are the annotators?
Eleven volunteers were involved in the process. All of them were native speakers of Tamil with diversity in gender, educational level and medium of instruction in their school education.
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@inproceedings{chakravarthi-etal-2020-corpus,
title = "Corpus Creation for Sentiment Analysis in Code-Mixed {T}amil-{E}nglish Text",
author = "Chakravarthi, Bharathi Raja and
Muralidaran, Vigneshwaran and
Priyadharshini, Ruba and
McCrae, John Philip",
booktitle = "Proceedings of the 1st Joint Workshop on Spoken Language Technologies for Under-resourced languages (SLTU) and Collaboration and Computing for Under-Resourced Languages (CCURL)",
month = may,
year = "2020",
address = "Marseille, France",
publisher = "European Language Resources association",
url = "https://www.aclweb.org/anthology/2020.sltu-1.28",
pages = "202--210",
abstract = "Understanding the sentiment of a comment from a video or an image is an essential task in many applications. Sentiment analysis of a text can be useful for various decision-making processes. One such application is to analyse the popular sentiments of videos on social media based on viewer comments. However, comments from social media do not follow strict rules of grammar, and they contain mixing of more than one language, often written in non-native scripts. Non-availability of annotated code-mixed data for a low-resourced language like Tamil also adds difficulty to this problem. To overcome this, we created a gold standard Tamil-English code-switched, sentiment-annotated corpus containing 15,744 comment posts from YouTube. In this paper, we describe the process of creating the corpus and assigning polarities. We present inter-annotator agreement and show the results of sentiment analysis trained on this corpus as a benchmark.",
language = "English",
ISBN = "979-10-95546-35-1",
}
```
### Contributions
Thanks to [@jamespaultg](https://github.com/jamespaultg) for adding this dataset. |
DavidCombei/Wav2Vec_MELD_Audio | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype: int64
splits:
- name: train
num_bytes: 865227517.2
num_examples: 9988
- name: validation
num_bytes: 101715665.292
num_examples: 1108
- name: test
num_bytes: 249974554.32
num_examples: 2608
download_size: 1108245106
dataset_size: 1216917736.812
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
task_categories:
- feature-extraction
language:
- en
size_categories:
- 10K<n<100K
--- |
open-llm-leaderboard/details_Kukedlc__Neural-4-ARC-7b | ---
pretty_name: Evaluation run of Kukedlc/Neural-4-ARC-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kukedlc/Neural-4-ARC-7b](https://huggingface.co/Kukedlc/Neural-4-ARC-7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kukedlc__Neural-4-ARC-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-30T16:17:20.581144](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__Neural-4-ARC-7b/blob/main/results_2024-03-30T16-17-20.581144.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6554184265790993,\n\
\ \"acc_stderr\": 0.03197936283986077,\n \"acc_norm\": 0.6549216878146381,\n\
\ \"acc_norm_stderr\": 0.032646649534862626,\n \"mc1\": 0.598531211750306,\n\
\ \"mc1_stderr\": 0.01716027390169366,\n \"mc2\": 0.7411577792308015,\n\
\ \"mc2_stderr\": 0.01443386393149793\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7133105802047781,\n \"acc_stderr\": 0.013214986329274776,\n\
\ \"acc_norm\": 0.7406143344709898,\n \"acc_norm_stderr\": 0.012808273573927106\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7172873929496116,\n\
\ \"acc_stderr\": 0.004493975527386738,\n \"acc_norm\": 0.8904600677155945,\n\
\ \"acc_norm_stderr\": 0.003116771577319422\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249386,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249386\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223144,\n\
\ \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223144\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659807,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659807\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834841,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834841\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n\
\ \"acc_stderr\": 0.01663583834163192,\n \"acc_norm\": 0.4491620111731844,\n\
\ \"acc_norm_stderr\": 0.01663583834163192\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.02378858355165854,\n\
\ \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.02378858355165854\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.01275285834653313,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.01275285834653313\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.598531211750306,\n\
\ \"mc1_stderr\": 0.01716027390169366,\n \"mc2\": 0.7411577792308015,\n\
\ \"mc2_stderr\": 0.01443386393149793\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.010099208246065592\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6929492039423806,\n \
\ \"acc_stderr\": 0.012705685723131714\n }\n}\n```"
repo_url: https://huggingface.co/Kukedlc/Neural-4-ARC-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|arc:challenge|25_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|gsm8k|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hellaswag|10_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-17-20.581144.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T16-17-20.581144.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- '**/details_harness|winogrande|5_2024-03-30T16-17-20.581144.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-30T16-17-20.581144.parquet'
- config_name: results
data_files:
- split: 2024_03_30T16_17_20.581144
path:
- results_2024-03-30T16-17-20.581144.parquet
- split: latest
path:
- results_2024-03-30T16-17-20.581144.parquet
---
# Dataset Card for Evaluation run of Kukedlc/Neural-4-ARC-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kukedlc/Neural-4-ARC-7b](https://huggingface.co/Kukedlc/Neural-4-ARC-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kukedlc__Neural-4-ARC-7b",
"harness_winogrande_5",
	split="latest")
```
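The split-naming convention above can be sketched as follows (an illustration based on this card's config entries, not part of any official tooling): the split name is the run timestamp with `-` and `:` replaced by `_`, while the per-task parquet filenames keep the date dashes and replace only `:` with `-`.

```python
def timestamp_to_split_name(run_timestamp: str) -> str:
    """Map a run timestamp to its split name, e.g.
    "2024-03-30T16:17:20.581144" -> "2024_03_30T16_17_20.581144"."""
    return run_timestamp.replace("-", "_").replace(":", "_")


def timestamp_to_file_suffix(run_timestamp: str) -> str:
    """Map a run timestamp to the suffix used in the parquet filenames, e.g.
    "2024-03-30T16:17:20.581144" -> "2024-03-30T16-17-20.581144"."""
    return run_timestamp.replace(":", "-")


print(timestamp_to_split_name("2024-03-30T16:17:20.581144"))
```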
## Latest results
These are the [latest results from run 2024-03-30T16:17:20.581144](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__Neural-4-ARC-7b/blob/main/results_2024-03-30T16-17-20.581144.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6554184265790993,
"acc_stderr": 0.03197936283986077,
"acc_norm": 0.6549216878146381,
"acc_norm_stderr": 0.032646649534862626,
"mc1": 0.598531211750306,
"mc1_stderr": 0.01716027390169366,
"mc2": 0.7411577792308015,
"mc2_stderr": 0.01443386393149793
},
"harness|arc:challenge|25": {
"acc": 0.7133105802047781,
"acc_stderr": 0.013214986329274776,
"acc_norm": 0.7406143344709898,
"acc_norm_stderr": 0.012808273573927106
},
"harness|hellaswag|10": {
"acc": 0.7172873929496116,
"acc_stderr": 0.004493975527386738,
"acc_norm": 0.8904600677155945,
"acc_norm_stderr": 0.003116771577319422
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223144,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223144
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659807,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659807
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834841,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834841
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4491620111731844,
"acc_stderr": 0.01663583834163192,
"acc_norm": 0.4491620111731844,
"acc_norm_stderr": 0.01663583834163192
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.02378858355165854,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.02378858355165854
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.01275285834653313,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.01275285834653313
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.598531211750306,
"mc1_stderr": 0.01716027390169366,
"mc2": 0.7411577792308015,
"mc2_stderr": 0.01443386393149793
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.010099208246065592
},
"harness|gsm8k|5": {
"acc": 0.6929492039423806,
"acc_stderr": 0.012705685723131714
}
}
```
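As a sanity check on these numbers (an illustrative sketch, not part of the evaluation harness): for a binary-scored task such as winogrande, the reported `acc_stderr` is consistent with the sample standard error of a proportion, sqrt(p·(1−p)/(n−1)). Here n = 1267 is an assumed size for the winogrande eval split; it is not stated in this card.

```python
import math

# Reported winogrande accuracy from the results above
acc = 0.8476716653512234
# Assumed number of winogrande eval examples (not stated in this card)
n = 1267

# Sample standard error of a proportion, using n - 1 in the denominator
stderr = math.sqrt(acc * (1 - acc) / (n - 1))
print(stderr)  # close to the reported acc_stderr of 0.010099208246065592
```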
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
sfblaauw/cofrico-v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 266372.6
num_examples: 3
- name: test
num_bytes: 122883.2
num_examples: 1
- name: validation
num_bytes: 42102.2
num_examples: 1
download_size: 425753
dataset_size: 431358.0
---
# Dataset Card for "cofrico-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MrOvkill/combined-svg | ---
license: apache-2.0
---
# NOTE: UNDER DEVELOPMENT. UP FOR STORAGE/COORDINATION. DO NOT USE UNTIL FURTHER NOTICE.
# Presynthion - Combined SVG
Combined SVG will be a dataset composed of renders, detailed text descriptions, captions, and stable diffusion prompts for all of the listed images in the datasets used.
Technologies Used (Primarily):
* MC-LLAVA-3b
* uform-gen2-qwen-500m
* gemma-2b-it
* cairosvg
* imagemagick
* transformers
* hf datasets
Services Used:
* Runpod.io (Server Time)
* Together Computer Inference (LLM Inference)
* My Own Desktop (Combination, Research, Development)
==============================
|
moranyanuka/OpenCHAIR | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: test
num_bytes: 309448153
num_examples: 2000
download_size: 307824866
dataset_size: 309448153
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# OpenCHAIR
The official OpenCHAIR dataset, introduced in [MOCHa: Multi-Objective Reinforcement Mitigating Caption Hallucinations](https://arxiv.org/pdf/2312.03631.pdf)
[Project Page](https://assafbk.github.io/mocha/)
## Usage
```python
from datasets import load_dataset
dataset = load_dataset("moranyanuka/OpenCHAIR")['test']
```
To evaluate hallucinations of VLMs on OpenCHAIR, please refer to our [GitHub repo](https://github.com/assafbk/mocha_code).
BibTeX:
```
@misc{benkish2024mitigating,
title={Mitigating Open-Vocabulary Caption Hallucinations},
author={Assaf Ben-Kish and Moran Yanuka and Morris Alper and Raja Giryes and Hadar Averbuch-Elor},
year={2024},
eprint={2312.03631},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
``` |
fxmeng/alpaca_in_mixtral_format | ---
dataset_info:
features:
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 20350215
num_examples: 52002
download_size: 11773876
dataset_size: 20350215
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "alpaca_in_mixtral_format"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/python-code-instructions-18k-alpaca-standardized_cluster_4 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 21410348
num_examples: 2510
download_size: 5471823
dataset_size: 21410348
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python-code-instructions-18k-alpaca-standardized_cluster_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deokhk/ko_wiki_sentences_1000000 | ---
dataset_info:
features:
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 175952266
num_examples: 1000000
- name: dev
num_bytes: 166735
num_examples: 1000
download_size: 106344214
dataset_size: 176119001
---
# Dataset Card for "ko_wiki_sentences_1000000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Codec-SUPERB/snips_test_valid_unit | ---
configs:
- config_name: default
data_files:
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: academicodec_hifi_16k_320d
num_bytes: 104890744
num_examples: 22400
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 104890744
num_examples: 22400
- name: academicodec_hifi_24k_320d
num_bytes: 156911640
num_examples: 22400
- name: audiodec_24k_320d
num_bytes: 335215352
num_examples: 22400
- name: dac_16k
num_bytes: 329541496
num_examples: 22400
- name: dac_24k
num_bytes: 1316239608
num_examples: 22400
- name: dac_44k
num_bytes: 425937832
num_examples: 22400
- name: encodec_24k_12bps
num_bytes: 627940216
num_examples: 22400
- name: encodec_24k_1_5bps
num_bytes: 79225672
num_examples: 22400
- name: encodec_24k_24bps
num_bytes: 1255042552
num_examples: 22400
- name: encodec_24k_3bps
num_bytes: 157613464
num_examples: 22400
- name: encodec_24k_6bps
num_bytes: 314389048
num_examples: 22400
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 838995192
num_examples: 22400
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 838995192
num_examples: 22400
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 838875384
num_examples: 22400
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 422686712
num_examples: 22400
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 838875384
num_examples: 22400
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 422686712
num_examples: 22400
- name: speech_tokenizer_16k
num_bytes: 210347256
num_examples: 22400
download_size: 1509323889
dataset_size: 9619300200
---
# Dataset Card for "snips_test_valid_unit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
startificial/nlp-getting-started | ---
license: openrail
---
|
joey234/mmlu-global_facts-rule-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 18801
num_examples: 100
download_size: 11161
dataset_size: 18801
---
# Dataset Card for "mmlu-global_facts-rule-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MattGPT/Krown | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 164094217.0
num_examples: 62
download_size: 153382874
dataset_size: 164094217.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lshowway/wikipedia.reorder.ovs.pl | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1958124685
num_examples: 1772445
download_size: 547217506
dataset_size: 1958124685
---
# Dataset Card for "wikipedia.reorder.ovs.pl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gilf/avyakti-parivar-json | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 329434
num_examples: 2789
download_size: 89515
dataset_size: 329434
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/bearn_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of bearn/ベアルン/贝亚恩 (Azur Lane)
This is the dataset of bearn/ベアルン/贝亚恩 (Azur Lane), containing 13 images and their tags.
The core tags of this character are `bangs, breasts, small_breasts, multicolored_hair, short_hair, horns, black_hair, glasses, grey_hair, blunt_bangs, grey_eyes, purple_hair, streaked_hair, two-tone_hair, blue_eyes, hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 14.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bearn_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 8.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bearn_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 23 | 14.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bearn_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 12.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bearn_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 23 | 20.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bearn_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/bearn_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, looking_at_viewer, solo, monocle, bare_shoulders, holding, simple_background, black_gloves, blush, closed_mouth, covered_navel, long_sleeves, thighhighs, dress, full_body, jacket, off_shoulder, swimsuit, thigh_boots, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | monocle | bare_shoulders | holding | simple_background | black_gloves | blush | closed_mouth | covered_navel | long_sleeves | thighhighs | dress | full_body | jacket | off_shoulder | swimsuit | thigh_boots | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:----------|:-----------------|:----------|:--------------------|:---------------|:--------|:---------------|:----------------|:---------------|:-------------|:--------|:------------|:---------|:---------------|:-----------|:--------------|:-------------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
GEM/Elongated_CACAPO_for_E2E | ---
task_categories:
- text-generation
language:
- nl
- en
tags:
- E2E
- Dutch
- English
- Reverse Engineered
- RDF to Sentence
pretty_name: CACAPO_elongated_training
size_categories:
- 10K<n<100K
license: cc-by-4.0
---
Dataset information can be found in the JSON file named "elongated_training_cacapo_updated-02_22_2023_23_23_20.json", which was created with the interactive dataset creator provided by Hugging Face. |
CyberHarem/shokuhou_misaki_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shokuhou_misaki/食蜂操祈/食蜂操祈 (Blue Archive)
This is the dataset of shokuhou_misaki/食蜂操祈/食蜂操祈 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `blonde_hair, long_hair, breasts, symbol-shaped_pupils, sparkling_eyes, yellow_eyes, large_breasts, star-shaped_pupils, hair_between_eyes, brown_eyes, bright_pupils, very_long_hair`, which are pruned in this dataset.
Images were crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 702.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shokuhou_misaki_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 607.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shokuhou_misaki_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1232 | 1.16 GiB | [Download](https://huggingface.co/datasets/CyberHarem/shokuhou_misaki_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shokuhou_misaki_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, elbow_gloves, gym_shirt, gym_shorts, gym_uniform, looking_at_viewer, sleeveless_shirt, solo, star_(symbol), white_gloves, white_shirt, bare_shoulders, smile, white_shorts, handbag, short_shorts, holding_remote_control, blush, school_emblem, white_background, white_thighhighs, closed_mouth, simple_background, strap_between_breasts, white_pupils, yellow_pupils |
| 1 | 8 |  |  |  |  |  | 1girl, elbow_gloves, gym_shirt, gym_shorts, gym_uniform, looking_at_viewer, sleeveless_shirt, star_(symbol), white_gloves, white_shirt, white_shorts, bare_shoulders, school_emblem, short_shorts, solo, white_thighhighs, simple_background, white_background, blush, lifted_by_self, navel, open_mouth, shirt_lift, smile, spider_web_print, tongue_out, white_pupils |
| 2 | 8 |  |  |  |  |  | 1girl, blazer, brown_jacket, long_sleeves, plaid_skirt, pleated_skirt, red_bowtie, solo, star_(symbol), tokiwadai_school_uniform, white_gloves, winter_uniform, collared_shirt, handbag, looking_at_viewer, school_emblem, smile, white_shirt, white_thighhighs, yellow_pupils, closed_mouth, zettai_ryouiki, blush, cowboy_shot, one_eye_closed, simple_background, spider_web_print, white_background, hand_on_own_hip, holding_remote_control, loafers, standing |
| 3 | 7 |  |  |  |  |  | 1girl, blazer, looking_at_viewer, pleated_skirt, solo, spider_web_print, tokiwadai_school_uniform, white_thighhighs, winter_uniform, zettai_ryouiki, plaid_skirt, smile, star_(symbol), white_gloves, handbag, bow, chain, open_mouth, remote_control |
| 4 | 6 |  |  |  |  |  | 1girl, elbow_gloves, looking_at_viewer, pleated_skirt, remote_control, short_sleeves, smile, solo, star_(symbol), summer_uniform, sweater_vest, tokiwadai_school_uniform, white_gloves, white_thighhighs, zettai_ryouiki, chain, handbag, spider_web_print |
| 5 | 5 |  |  |  |  |  | 1girl, chain, elbow_gloves, handbag, looking_at_viewer, pleated_skirt, remote_control, short_sleeves, smile, solo, spider_web_print, star_(symbol), summer_uniform, sweater_vest, tokiwadai_school_uniform, white_gloves, white_thighhighs, zettai_ryouiki, blush, impossible_clothes, strap_between_breasts, v-neck, loafers, sitting |
| 6 | 10 |  |  |  |  |  | 1girl, brown_sweater_vest, collared_shirt, elbow_gloves, looking_at_viewer, pleated_skirt, short_sleeves, simple_background, solo, star_(symbol), summer_uniform, tokiwadai_school_uniform, white_gloves, white_shirt, white_thighhighs, smile, white_background, closed_mouth, school_emblem, zettai_ryouiki, cowboy_shot, grey_skirt, blush, handbag, white_pupils, hand_on_own_hip, spider_web_print |
| 7 | 6 |  |  |  |  |  | 1girl, elbow_gloves, looking_at_viewer, remote_control, solo, star_(symbol), summer_uniform, sweater_vest, tokiwadai_school_uniform, white_gloves, smile, upper_body, short_sleeves |
| 8 | 8 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nude, sex, spider_web_print, vaginal, white_thighhighs, elbow_gloves, mosaic_censoring, nipples, open_mouth, penis, solo_focus, sweat, white_gloves, navel, on_back, missionary, spread_legs, thighs, cum_in_pussy, looking_at_viewer, bed_sheet, closed_eyes, collarbone, on_bed, pov |
| 9 | 5 |  |  |  |  |  | 1girl, cleavage, elbow_gloves, looking_at_viewer, navel, smile, solo, star_(symbol), white_gloves, white_thighhighs, blush, holding_remote_control, one_eye_closed, spider_web_print, underwear_only, white_pupils, ;), armpits, arms_up, pink_bra, pink_panties, side-tie_bikini_bottom, simple_background, stomach, thighs, white_background, yellow_bikini |
| 10 | 6 |  |  |  |  |  | 1girl, star_(symbol), black_bowtie, cleavage, detached_collar, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, simple_background, smile, solo, strapless_leotard, bare_shoulders, elbow_gloves, high_heels, pantyhose, rabbit_tail, white_gloves, black_footwear, black_leotard, blush, highleg_leotard, pink_halo, sideboob, star_halo, thigh_strap, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | elbow_gloves | gym_shirt | gym_shorts | gym_uniform | looking_at_viewer | sleeveless_shirt | solo | star_(symbol) | white_gloves | white_shirt | bare_shoulders | smile | white_shorts | handbag | short_shorts | holding_remote_control | blush | school_emblem | white_background | white_thighhighs | closed_mouth | simple_background | strap_between_breasts | white_pupils | yellow_pupils | lifted_by_self | navel | open_mouth | shirt_lift | spider_web_print | tongue_out | blazer | brown_jacket | long_sleeves | plaid_skirt | pleated_skirt | red_bowtie | tokiwadai_school_uniform | winter_uniform | collared_shirt | zettai_ryouiki | cowboy_shot | one_eye_closed | hand_on_own_hip | loafers | standing | bow | chain | remote_control | short_sleeves | summer_uniform | sweater_vest | impossible_clothes | v-neck | sitting | brown_sweater_vest | grey_skirt | upper_body | 1boy | hetero | nude | sex | vaginal | mosaic_censoring | nipples | penis | solo_focus | sweat | on_back | missionary | spread_legs | thighs | cum_in_pussy | bed_sheet | closed_eyes | collarbone | on_bed | pov | cleavage | underwear_only | ;) | armpits | arms_up | pink_bra | pink_panties | side-tie_bikini_bottom | stomach | yellow_bikini | black_bowtie | detached_collar | fake_animal_ears | playboy_bunny | rabbit_ears | strapless_leotard | high_heels | pantyhose | rabbit_tail | black_footwear | black_leotard | highleg_leotard | pink_halo | sideboob | star_halo | thigh_strap |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------|:------------|:-------------|:--------------|:--------------------|:-------------------|:-------|:----------------|:---------------|:--------------|:-----------------|:--------|:---------------|:----------|:---------------|:-------------------------|:--------|:----------------|:-------------------|:-------------------|:---------------|:--------------------|:------------------------|:---------------|:----------------|:-----------------|:--------|:-------------|:-------------|:-------------------|:-------------|:---------|:---------------|:---------------|:--------------|:----------------|:-------------|:---------------------------|:-----------------|:-----------------|:-----------------|:--------------|:-----------------|:------------------|:----------|:-----------|:------|:--------|:-----------------|:----------------|:-----------------|:---------------|:---------------------|:---------|:----------|:---------------------|:-------------|:-------------|:-------|:---------|:-------|:------|:----------|:-------------------|:----------|:--------|:-------------|:--------|:----------|:-------------|:--------------|:---------|:---------------|:------------|:--------------|:-------------|:---------|:------|:-----------|:-----------------|:-----|:----------|:----------|:-----------|:---------------|:-------------------------|:----------|:----------------|:---------------|:------------------|:-------------------|:----------------|:--------------|:--------------------|:-------------|:------------|:--------------|:-----------------|:----------------|:------------------|:------------|:-----------|:------------|:--------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | | X | X | X | X | | X | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | | | | | X | | X | X | X | X | | X | | X | | X | X | X | X | X | X | X | | | X | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | | | X | | X | X | X | | | X | | X | | | | | | X | | | | | | | | X | | X | | X | | | X | X | | X | X | | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | | | | X | | X | X | X | | | X | | X | | | | | | X | | | | | | | | | | X | | | | | | X | | X | | | X | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | | | | X | | X | X | X | | | X | | X | | | X | | | X | | | X | | | | | | | X | | | | | | X | | X | | | X | | | | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 10 |  |  |  |  |  | X | X | | | | X | | X | X | X | X | | X | | X | | | X | X | X | X | X | X | | X | | | | | | X | | | | | | X | | X | | X | X | X | | X | | | | | | X | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | | | | X | | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 8 |  |  |  |  |  | X | X | | | | X | | | | X | | | | | | | | X | | | X | | | | | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | X | | | | X | | X | X | X | | | X | | | | X | X | | X | X | | X | | X | | | X | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 10 | 6 |  |  |  |  |  | X | X | | | | X | | X | X | X | | X | X | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
vikp/evol_instruct_v2_filtered_109k | ---
dataset_info:
features:
- name: idx
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: rendered
dtype: string
- name: quality_prob
dtype: float64
- name: learning_prob
dtype: float64
splits:
- name: train
num_bytes: 512830593.9343947
num_examples: 109797
download_size: 252022478
dataset_size: 512830593.9343947
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "evol_instruct_v2_filtered_109k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MaxYuki/Lily-Angel | ---
license: apache-2.0
---
|
csaybar/supersat | ---
license: mit
---
|
NobodyExistsOnTheInternet/500correct | ---
license: mit
---
|
andersonbcdefg/bioasq_beir_generated_queries_pairs | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
splits:
- name: train
num_bytes: 15633776003
num_examples: 12073035
download_size: 8852826029
dataset_size: 15633776003
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alexjercan/bugnet | ---
dataset_info:
- config_name: Python
features:
- name: problem_id
dtype: string
- name: language
dtype: string
- name: original_status
dtype: string
- name: fail
dtype: string
- name: pass
dtype: string
- name: change
dtype: string
- name: i1
dtype: uint32
- name: i2
dtype: uint32
- name: j1
dtype: uint32
- name: j2
dtype: uint32
- name: error
dtype: string
- name: stderr
dtype: string
- name: stdout
dtype: string
- name: description
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 8237153
num_examples: 2557
- name: validation
num_bytes: 3497872
num_examples: 1105
- name: test
num_bytes: 205241
num_examples: 100
download_size: 19290233
dataset_size: 11940266
- config_name: C++
features:
- name: problem_id
dtype: string
- name: language
dtype: string
- name: original_status
dtype: string
- name: fail
dtype: string
- name: pass
dtype: string
- name: change
dtype: string
- name: i1
dtype: uint32
- name: i2
dtype: uint32
- name: j1
dtype: uint32
- name: j2
dtype: uint32
- name: error
dtype: string
- name: stderr
dtype: string
- name: stdout
dtype: string
- name: description
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 482930200
num_examples: 68621
- name: validation
num_bytes: 1129323
num_examples: 125
- name: test
num_bytes: 40048505
num_examples: 4769
download_size: 378900920
dataset_size: 524108028
---
# About the Dataset
The source code used to generate the dataset can be found on
[GitHub](https://github.com/alexjercan/bug-detection/tree/master/bugnet)
The dataset is based on the [CodeNet project](https://github.com/IBM/Project_CodeNet)
and contains Python and C++ code submissions for online coding competitions. The data
is obtained by selecting consecutive attempts of a single user that resulted in fixing a
buggy submission. Thus the data is represented by code pairs and annotated by the diff
and error of each changed instruction. We have already tokenized all the source code
files and kept the same format as in the original dataset.
Compared to CodeNetPy, the upgrade is that we only keep one-line errors. This makes
the bug detection and repair tasks easier to manage. We also removed all files that
fail on linters, so that we focus only on bugs that cannot be identified easily.
The resulting dataset file will be a csv with the following columns:
- `problem_id`: The id of the problem, matches with the id from Project_CodeNet
- `language`: The programming language of the submission (`Python` or `C++`)
- `original_status`: The status of the initial submission (`TLE`, `MLE`, anything that is not `Accepted`)
- `fail`: The initial (buggy) source code, formatted (`black` or `clang-format`)
- `pass`: The modified (accepted) source code, formatted (`black` or `clang-format`)
- `change`: The change that was made (`replace`, `insert`, `delete`)
- `i1`: Start of the change in the buggy source (the line; starting with 1)
- `i2`: End of the change in the buggy source (not inclusive; for `insert` we have `i1 == i2`)
- `j1`: Start of the change in the accepted source (the line; starting with 1)
- `j2`: End of the change in the accepted source (not inclusive; for `delete` we have `j1 == j2`)
- `error`: The error that was obtained running the buggy source code on the input/output examples
- `stderr`: The full output of stderr of running the buggy source code on the input/output examples
- `stdout`: The full output of stdout of running the buggy source code on the input/output examples
- `description`: The problem statement in html format
- `input`: The input for the test case
- `output`: The output for the test case
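As a toy illustration (not the repository's actual generation code, which lives in the linked GitHub repo), the `change`/`i1`/`i2`/`j1`/`j2` annotations described above can be reproduced with Python's `difflib` on a hypothetical buggy/fixed pair:

```python
import difflib

# A hypothetical buggy submission and its accepted fix, as lists of lines.
buggy = ["n = int(input())", "print(n + 2)"]
fixed = ["n = int(input())", "print(n + 1)"]

# get_opcodes() yields (tag, i1, i2, j1, j2) with 0-based line indices;
# the dataset's columns hold the same values shifted to start at 1,
# with the end indices not inclusive.
matcher = difflib.SequenceMatcher(None, buggy, fixed)
ops = [op for op in matcher.get_opcodes() if op[0] != "equal"]

change, i1, i2, j1, j2 = ops[0]
print(change, i1 + 1, i2 + 1, j1 + 1, j2 + 1)  # replace 2 3 2 3
```

For an `insert` change the buggy-side indices coincide (`i1 == i2`), and for a `delete` the accepted-side indices do (`j1 == j2`), matching the column descriptions above.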
|
anaili/test-test | ---
license: apache-2.0
---
|
Malvinan/bloom_vpg_pretraining | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: language
dtype: string
- name: image_list
sequence: string
- name: annotations
sequence: string
- name: input_token_ids
sequence:
sequence: int64
- name: output_token_ids
sequence:
sequence: int64
splits:
- name: train
num_bytes: 4928751680
num_examples: 5068744
- name: validation
num_bytes: 196469776
num_examples: 194954
download_size: 633120669
dataset_size: 5125221456
---
# Dataset Card for "bloom_vpg_pretraining"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboussarov/llm-try | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 2196528.0
num_examples: 268
- name: test
num_bytes: 245880.0
num_examples: 30
download_size: 1128058
dataset_size: 2442408.0
---
# Dataset Card for "llm-try"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713152094 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 5548
num_examples: 15
download_size: 9825
dataset_size: 5548
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713152094"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
trooaditya/fashion_accessories_dataset_all | ---
dataset_info:
features:
- name: image (image)
dtype: image
- name: text (string)
dtype: string
splits:
- name: train
num_bytes: 114814464.224
num_examples: 11274
download_size: 81084268
dataset_size: 114814464.224
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CATIE-AQ/squad_v2_french_translated_fr_prompt_context_generation_with_answer | ---
language:
- fr
license: apache-2.0
size_categories:
- 1M<n<10M
task_categories:
- text-generation
tags:
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- squad_v2_french_translated
---
# squad_v2_french_translated_fr_prompt_context_generation_with_answer
## Summary
**squad_v2_french_translated_fr_prompt_context_generation_with_answer** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **1,271,928** rows that can be used for a context-generation (with answer) task.
The original data (without prompts) comes from the dataset [pragnakalp/squad_v2_french_translated](https://huggingface.co/datasets/pragnakalp/squad_v2_french_translated) and was augmented by questions in SQUAD 2.0 format in the [FrenchQA]( https://huggingface.co/datasets/CATIE-AQ/frenchQA) dataset.
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
24 prompts were created for this dataset. The logic applied consists in proposing prompts in the indicative tense, in the form of tutoiement and in the form of vouvoiement.
```
'Étant donné la réponse "'+ answer+'", écrire un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", écris un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", écrivez un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", rédiger un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", rédige un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", rédigez un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", générer un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", génère un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", générez un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", créer un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", crée un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", créez un texte explicatif.\nTexte : ',
'Ecrire un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Ecris un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Ecrivez un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Rédiger un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Rédige un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Rédigez un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Générer un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Génère un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Générez un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Créer un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Crée un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Créez un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
```
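As a minimal sketch of how such a row could be constructed (the code below is an assumption for illustration, not the actual DFP generation script; the `answer`/`context` values are made up), each prompt above is filled with an answer to form the input, and the original context becomes the target:

```python
# Hypothetical answer/context pair from the source QA data.
answer = "Paris"
context = "Paris est la capitale de la France."

# One of the 24 templates listed above.
template = 'Étant donné la réponse "{answer}", écrire un texte explicatif.\nTexte : '

inputs = template.format(answer=answer)  # prompt with the answer inserted
targets = context                        # model should generate the context
print(inputs + targets)
```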
# Splits
- `train` with 1,271,928 samples
- no `valid` split
- no `test` split
# How to use?
```
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/squad_v2_french_translated_fr_prompt_context_generation_with_answer")
```
# Citation
## Original data
> Hugging Face repository: https://huggingface.co/datasets/pragnakalp/squad_v2_french_translated
## This Dataset
> @misc{centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
> author = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
> title = { DFP (Revision 1d24c09) },
> year = 2023,
> url = { https://huggingface.co/datasets/CATIE-AQ/DFP },
> doi = { 10.57967/hf/1200 },
> publisher = { Hugging Face }
> }
## License
apache-2.0 |
nhantruongcse/100k_train_large_vietnews | ---
dataset_info:
features:
- name: Content
dtype: string
- name: Summary
dtype: string
splits:
- name: train
num_bytes: 374682895.7903489
num_examples: 100000
download_size: 199533407
dataset_size: 374682895.7903489
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Vulpeka/autotrain-data-psupportai | ---
language:
- en
---
# AutoTrain Dataset for project: psupportai
## Dataset Description
This dataset has been automatically processed by AutoTrain for project psupportai.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"context": "Account recovery",
"question": "How can I recover my LoL account",
"answers.text": [
"You need to contact with Player Support, you need to have everything ready to use "
],
"answers.answer_start": [
1
]
},
{
"context": "Billing issue",
"question": "I can't purchase RP with my credit card",
"answers.text": [
"You need to contact with Player Support, you need to have everything ready to use "
],
"answers.answer_start": [
2
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"context": "Value(dtype='string', id=None)",
"question": "Value(dtype='string', id=None)",
"answers.text": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"answers.answer_start": "Sequence(feature=Value(dtype='int32', id=None), length=-1, id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 2 |
| valid | 1 |
|
bot-yaya/undl_zh2en_aligned | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: record
dtype: string
- name: clean_para_index_set_pair
dtype: string
- name: src
dtype: string
- name: dst
dtype: string
- name: src_text
dtype: string
- name: dst_text
dtype: string
- name: src_rate
dtype: float64
- name: dst_rate
dtype: float64
splits:
- name: train
num_bytes: 8884444751
num_examples: 15331650
download_size: 2443622169
dataset_size: 8884444751
---
# Dataset Card for "undl_zh2en_aligned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_rte_acomp_focusing_like | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 248924
num_examples: 533
- name: train
num_bytes: 239368
num_examples: 487
download_size: 322334
dataset_size: 488292
---
# Dataset Card for "MULTI_VALUE_rte_acomp_focusing_like"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Bluckr/function-calling-assistant-spanish-pofi-v2 | ---
license: other
license_name: uso-libre
license_link: LICENSE
---
|
pawlo2013/kimchi-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': baechu
'1': baik
'2': boochoo
'3': chongkak
'4': got
'5': kkakdoogi
'6': moosaengchae
'7': nabak
'8': ohyeesobaki
'9': pa
'10': yeolmoo
splits:
- name: train
num_bytes: 14601643.5
num_examples: 7700
- name: validation
num_bytes: 2080923.6
num_examples: 1100
- name: test
num_bytes: 4160510.8
num_examples: 2200
download_size: 15508346
dataset_size: 20843077.9
---
# Dataset Card for "kimchi-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mariosasko/test | ---
dataset_info:
features:
- name: foo
dtype: int64
splits:
- name: train
num_bytes: 24
num_examples: 3
download_size: 842
dataset_size: 24
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dresen/common_voice_13_0_da_pseudo_labelled | ---
dataset_info:
config_name: da
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 70558926.204
num_examples: 2746
- name: validation
num_bytes: 70260246.264
num_examples: 2222
- name: test
num_bytes: 63222825.0
num_examples: 2160
download_size: 195972937
dataset_size: 204041997.468
configs:
- config_name: da
data_files:
- split: train
path: da/train-*
- split: validation
path: da/validation-*
- split: test
path: da/test-*
---
|
freshpearYoon/v3_train_free_1 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 15366802344
num_examples: 10000
download_size: 2524589101
dataset_size: 15366802344
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sanjay920/code74k-sharegpt | ---
dataset_info:
features:
- name: conversations
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 158838541
num_examples: 73928
download_size: 74237277
dataset_size: 158838541
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
karmiq/wikipedia-embeddings-cs-seznam-mpnet | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: chunks
sequence: string
- name: embeddings
sequence:
sequence: float32
splits:
- name: train
num_bytes: 2580729273
num_examples: 534044
download_size: 2307703671
dataset_size: 2580729273
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- cs
size_categories:
- 100K<n<1M
task_categories:
- text-generation
- fill-mask
license:
- cc-by-sa-3.0
- gfdl
---
This dataset contains the Czech subset of the [`wikimedia/wikipedia`](https://huggingface.co/datasets/wikimedia/wikipedia) dataset. Each page is divided into paragraphs, stored as a list in the `chunks` column. For every paragraph, embeddings are created using the [`Seznam/simcse-dist-mpnet-paracrawl-cs-en`](https://huggingface.co/Seznam/simcse-dist-mpnet-paracrawl-cs-en) model.
## Usage
Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("karmiq/wikipedia-embeddings-cs-seznam-mpnet", split="train")
ds[1]
```
```
{
'id': '1',
'url': 'https://cs.wikipedia.org/wiki/Astronomie',
'title': 'Astronomie',
'chunks': [
'Astronomie, řecky αστρονομία z άστρον ( astron ) hvězda a νόμος ( nomos ) ...',
'Novověk Roku 1514 navrhl Mikuláš Koperník nový model, ve kterém bylo ...',
...,
],
'embeddings': [
[ 0.653917670249939, -0.879465639591217, 0.3993946313858032, ... ]
[ 0.0035442777443677187, -1.0201066732406616, -0.06573136150836945, ... ]
]
}
```
The structure makes it easy to use the dataset for implementing semantic search.
<details>
<summary>Load the data in Elasticsearch</summary>
```python
from tqdm import tqdm
from elasticsearch.helpers import parallel_bulk

def doc_generator(data, batch_size=1000):
    for batch in data.with_format("numpy").iter(batch_size):
        for i, id in enumerate(batch["id"]):
            output = {"id": id}
            output["title"] = batch["title"][i]
            output["url"] = batch["url"][i]
            output["parts"] = [
                { "chunk": chunk, "embedding": embedding }
                for chunk, embedding in zip(batch["chunks"][i], batch["embeddings"][i])
            ]
            yield output

num_indexed, num_failed = 0, 0
progress = tqdm(total=ds.num_rows, unit="doc", desc="Indexing")
for ok, info in parallel_bulk(
    es,  # an elasticsearch.Elasticsearch client
    index="wikipedia-search",
    actions=doc_generator(ds),
    raise_on_error=False,
):
    if ok:
        num_indexed += 1
    else:
        num_failed += 1
        print(f"ERROR {info['index']['status']}: {info['index']['error']}")
    progress.update(1)
```
</details>
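Once the documents are indexed, you can issue a semantic query against the per-chunk embeddings. The sketch below only builds the request body; it assumes a hypothetical mapping in which `parts` is a `nested` field and `parts.embedding` a `dense_vector` (adjust the field names to your actual mapping), and the resulting dictionary would be passed to `es.search(index="wikipedia-search", body=...)` together with an embedding of the query text.

```python
def build_semantic_query(query_vector, top_k=10):
    # Nested script_score query: scores each document by the best-matching
    # chunk, using cosine similarity between the chunk embedding and the
    # query embedding. Field names assume a mapping where `parts` is
    # `nested` and `parts.embedding` is a `dense_vector`.
    return {
        "size": top_k,
        "query": {
            "nested": {
                "path": "parts",
                "score_mode": "max",
                "query": {
                    "script_score": {
                        "query": {"match_all": {}},
                        "script": {
                            # +1.0 keeps scores non-negative, as required
                            # by Elasticsearch script_score
                            "source": "cosineSimilarity(params.query_vector, 'parts.embedding') + 1.0",
                            "params": {"query_vector": query_vector},
                        },
                    }
                },
            }
        },
    }

query = build_semantic_query([0.1, 0.2, 0.3], top_k=5)
```

This is only one way to run the search; recent Elasticsearch versions also offer a dedicated `knn` search option when the vector field is indexed for approximate nearest-neighbor retrieval.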
<details>
<summary>Use <code>sentence_transformers.util.semantic_search</code></summary>
```python
import os
import textwrap
import sentence_transformers
from sentence_transformers import SentenceTransformer
from sentence_transformers.models import Transformer, Pooling
embedding_model = Transformer("Seznam/simcse-dist-mpnet-paracrawl-cs-en")
pooling = Pooling(word_embedding_dimension=embedding_model.get_word_embedding_dimension(), pooling_mode="cls")
model = SentenceTransformer(modules=[embedding_model, pooling])
ds.set_format(type="torch", columns=["embeddings"], output_all_columns=True)
# Flatten the dataset
def explode_sequence(batch):
output = { "id": [], "url": [], "title": [], "chunk": [], "embedding": [] }
for id, url, title, chunks, embeddings in zip(
batch["id"], batch["url"], batch["title"], batch["chunks"], batch["embeddings"]
):
output["id"].extend([id for _ in range(len(chunks))])
output["url"].extend([url for _ in range(len(chunks))])
output["title"].extend([title for _ in range(len(chunks))])
output["chunk"].extend(chunks)
output["embedding"].extend(embeddings)
return output
ds_flat = ds.map(
explode_sequence,
batched=True,
remove_columns=ds.column_names,
num_proc=min(os.cpu_count(), 32),
desc="Flatten")
ds_flat
query = "Čím se zabývá fyzika?"
hits = sentence_transformers.util.semantic_search(
query_embeddings=model.encode(query),
corpus_embeddings=ds_flat["embedding"],
top_k=10)
for hit in hits[0]:
title = ds_flat[hit['corpus_id']]['title']
chunk = ds_flat[hit['corpus_id']]['chunk']
print(f"[{hit['score']:0.2f}] {textwrap.shorten(chunk, width=100, placeholder='…')} [{title}]")
# [0.72] Molekulová fyzika ( též molekulární fyzika ) je část fyziky, která zkoumá látky na úrovni atomů a… [Molekulová fyzika]
# [0.70] Fyzika ( z řeckého φυσικός ( fysikos ): přírodní, ze základu φύσις ( fysis ): příroda, archaicky… [Fyzika]
# ...
```
</details>
Generating the embeddings took about 35 minutes on an NVIDIA A100 80GB GPU.
## License
See license of the original dataset: <https://huggingface.co/datasets/wikimedia/wikipedia>.
|
open-llm-leaderboard/details_TinyPixel__lima-test | ---
pretty_name: Evaluation run of TinyPixel/lima-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TinyPixel/lima-test](https://huggingface.co/TinyPixel/lima-test) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TinyPixel__lima-test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T19:33:22.756804](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyPixel__lima-test/blob/main/results_2023-10-17T19-33-22.756804.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.0003778609196461008,\n \"f1\": 0.05645763422818797,\n\
\ \"f1_stderr\": 0.0013211879752480866,\n \"acc\": 0.40996862682678575,\n\
\ \"acc_stderr\": 0.009889312522114477\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196461008,\n\
\ \"f1\": 0.05645763422818797,\n \"f1_stderr\": 0.0013211879752480866\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07960576194086429,\n \
\ \"acc_stderr\": 0.007455924338676286\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TinyPixel/lima-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|arc:challenge|25_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T19_33_22.756804
path:
- '**/details_harness|drop|3_2023-10-17T19-33-22.756804.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T19-33-22.756804.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T19_33_22.756804
path:
- '**/details_harness|gsm8k|5_2023-10-17T19-33-22.756804.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T19-33-22.756804.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hellaswag|10_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T09:10:45.645303.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T09:10:45.645303.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T09:10:45.645303.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T19_33_22.756804
path:
- '**/details_harness|winogrande|5_2023-10-17T19-33-22.756804.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T19-33-22.756804.parquet'
- config_name: results
data_files:
- split: 2023_08_28T09_10_45.645303
path:
- results_2023-08-28T09:10:45.645303.parquet
- split: 2023_10_17T19_33_22.756804
path:
- results_2023-10-17T19-33-22.756804.parquet
- split: latest
path:
- results_2023-10-17T19-33-22.756804.parquet
---
# Dataset Card for Evaluation run of TinyPixel/lima-test
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TinyPixel/lima-test
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TinyPixel/lima-test](https://huggingface.co/TinyPixel/lima-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TinyPixel__lima-test",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-17T19:33:22.756804](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyPixel__lima-test/blob/main/results_2023-10-17T19-33-22.756804.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196461008,
"f1": 0.05645763422818797,
"f1_stderr": 0.0013211879752480866,
"acc": 0.40996862682678575,
"acc_stderr": 0.009889312522114477
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196461008,
"f1": 0.05645763422818797,
"f1_stderr": 0.0013211879752480866
},
"harness|gsm8k|5": {
"acc": 0.07960576194086429,
"acc_stderr": 0.007455924338676286
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552667
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
arieg/bw_spec_cls_80_25 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '59702'
'1': '59706'
'2': '59707'
'3': '59708'
'4': '59709'
'5': '59710'
'6': '59719'
'7': '59720'
'8': '59721'
'9': '59723'
'10': '59724'
'11': '59725'
'12': '59726'
'13': '59727'
'14': '59823'
'15': '59876'
'16': '59930'
'17': '60037'
'18': '60038'
'19': '60041'
'20': '60042'
'21': '60045'
'22': '60048'
'23': '60074'
'24': '60143'
'25': '60144'
'26': '60145'
'27': '60146'
'28': '60170'
'29': '60317'
'30': '60472'
'31': '60474'
'32': '60477'
'33': '60478'
'34': '60510'
'35': '60544'
'36': '60547'
'37': '60548'
'38': '60549'
'39': '60736'
'40': '60753'
'41': '60754'
'42': '60755'
'43': '60756'
'44': '60757'
'45': '60758'
'46': '60775'
'47': '60776'
'48': '60777'
'49': '60857'
'50': '60864'
'51': '60865'
'52': '60994'
'53': '61006'
'54': '61007'
'55': '61008'
'56': '61010'
'57': '61011'
'58': '61012'
'59': '61013'
'60': '61159'
'61': '61160'
'62': '61161'
'63': '61172'
'64': '61174'
'65': '61175'
'66': '61452'
'67': '61453'
'68': '61491'
'69': '61492'
'70': '61493'
'71': '61587'
'72': '61589'
'73': '61591'
'74': '61592'
'75': '61668'
'76': '61670'
'77': '61679'
'78': '61814'
'79': '61884'
splits:
- name: train
num_bytes: 93110896.0
num_examples: 1600
- name: test
num_bytes: 22653803.0
num_examples: 400
download_size: 113211430
dataset_size: 115764699.0
---
# Dataset Card for "bw_spec_cls_80_25"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
notrichardren/truthfulness_high_quality | ---
configs:
- config_name: default
data_files:
- split: combined
path: data/combined-*
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: claim
dtype: string
- name: label
dtype: int64
- name: dataset
dtype: string
- name: qa_type
dtype: int64
- name: ind
dtype: int64
splits:
- name: combined
num_bytes: 12293840
num_examples: 106153
- name: train
num_bytes: 9841127
num_examples: 84922
- name: test
num_bytes: 2452713
num_examples: 21231
download_size: 6560566
dataset_size: 24587680
---
# Dataset Card for "truthfulness_high_quality"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joefox/Mozilla_Common_Voice_en_test_noise | ---
license: apache-2.0
---
### Dataset Summary
An augmented version of the test split of the Mozilla Common Voice dataset (version 10, English, test).
The original test split was taken as a basis, and augmentation was carried out to add extraneous noise.
Dataset part: test
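The noise augmentation described above can be sketched as a simple SNR-based mix (a minimal sketch using NumPy; the actual noise sources and SNR levels used to build this dataset are not specified here, so treat the function below as illustrative only):

```python
import numpy as np

def add_noise(speech: np.ndarray, noise: np.ndarray, snr_db: float) -> np.ndarray:
    """Mix `noise` into `speech` at the requested signal-to-noise ratio (in dB)."""
    # Tile or trim the noise so it matches the speech length.
    if len(noise) < len(speech):
        noise = np.tile(noise, int(np.ceil(len(speech) / len(noise))))
    noise = noise[: len(speech)]

    speech_power = np.mean(speech ** 2)
    noise_power = np.mean(noise ** 2)
    # Scale the noise so that speech_power / (scale**2 * noise_power) == 10**(snr_db/10).
    scale = np.sqrt(speech_power / (noise_power * 10 ** (snr_db / 10)))
    return speech + scale * noise
```

Lower `snr_db` values bury the speech deeper in noise; the scaling is exact by construction, so the realized SNR of the added component equals the requested one.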
|
danaroth/cave | ---
license: unknown
---
# Description
This database contains a set of multispectral images that were used to emulate the GAP camera. The images are of a wide variety of real-world materials and objects.
# Image capture information
| | |
|---------------------|----------------------------------------|
| Camera | Cooled CCD camera (Apogee Alta U260) |
| Resolution | 512 x 512 pixel |
| Filter | VariSpec liquid crystal tunable filter |
| Illuminant | CIE Standard Illuminant D65 |
| Wavelength range | 400nm - 700nm |
| Steps | 10nm |
| Number of bands | 31 bands |
| Focal length | f/1.4 |
| Focus | Fixed (focused using 550nm image) |
| Image format | PNG (16bit) |
NOTE: These multispectral images represent the reflectances of the materials in the scene. The reflectance spectrum at each pixel was computed from the measured multispectral image using calibrated estimates of the illumination spectrum and camera spectral response. As a result, each multispectral image should be taken as a close approximation, rather than an exact measurement, of the true reflectance of the scene.
# Database organization
The database consists of 32 scenes, divided into 5 sections. Each scene has an associated zip file. These zip files include full spectral resolution reflectance data from 400nm to 700nm at 10nm steps (31 bands total). Each band is stored as a 16-bit grayscale PNG image. Image filenames are of the format 'object_ms_01.png', where the '01' at the end signifies that this is the first image (captured at 400nm). Thus, '02' corresponds to 410nm, and so on, until '31' for 700nm.
Each scene also contains a single representative color image, displayed using sRGB values rendered under a neutral daylight illuminant (D65).
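The band-index-to-wavelength convention above can be captured in a small helper (a sketch only, not part of the official distribution; the scene name in the example is a placeholder):

```python
def band_wavelength(band: int) -> int:
    """Wavelength in nm for a 1-based band index: '01' -> 400nm, ..., '31' -> 700nm."""
    if not 1 <= band <= 31:
        raise ValueError("band index must be in 1..31")
    return 400 + 10 * (band - 1)

def band_filename(scene: str, band: int) -> str:
    """Filename of one band image, following the 'object_ms_01.png' pattern."""
    return f"{scene}_ms_{band:02d}.png"
```

For instance, band 16 corresponds to 550nm, the wavelength used to fix the camera focus.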
# Quick look
<figure>
<img src= "assets/teaser.png" alt="CAVE" width="500" />
<figcaption>CAVE dataset teaser image.</figcaption>
</figure>
# Access instructions
The database of 32 scenes is divided into 5 sections, as shown below:
- [Stuff](https://www.cs.columbia.edu/CAVE/databases/multispectral/stuff/)
<figure>
<img src= "assets/thumbnail_stuff.png" alt="CAVE stuff" width="500" />
</figure>
- [Skin and hair](https://www.cs.columbia.edu/CAVE/databases/multispectral/skin_and_hair/)
<figure>
<img src= "assets/thumbnail_skin_and_hair.png" alt="CAVE skin and hair" width="300" />
</figure>
- [Paints](https://www.cs.columbia.edu/CAVE/databases/multispectral/paints/)
<figure>
<img src= "assets/thumbnail_paints.png" alt="CAVE paints" width="200" />
</figure>
- [Food and drinks](https://www.cs.columbia.edu/CAVE/databases/multispectral/food_and_drinks/)
<figure>
<img src= "assets/thumbnail_food_and_drinks.png" alt="CAVE food and drinks" width="200" />
</figure>
- [Real and fake](https://www.cs.columbia.edu/CAVE/databases/multispectral/real_and_fake/)
<figure>
<img src= "assets/thumbnail_real_and_fake.png" alt="CAVE real and fake" width="600" />
</figure>
# Related project
[Generalized assorted pixel camera](https://www.cs.columbia.edu/CAVE/projects/gap_camera/)
# Credits
The dataset was originally provided by Columbia University and available at:
https://www.cs.columbia.edu/CAVE/databases/multispectral/
# Publication
Details of the database can be found in the following publication:
```
"Generalized Assorted Pixel Camera: Post-Capture Control of Resolution, Dynamic Range and Spectrum,"
F. Yasuma, T. Mitsunaga, D. Iso, and S.K. Nayar,
Technical Report, Department of Computer Science, Columbia University CUCS-061-08, Nov. 2008.
```
[PDF](https://www1.cs.columbia.edu/CAVE/publications/pdfs/Yasuma_TR08.pdf) - [bib](https://www1.cs.columbia.edu/CAVE/publications/pub_bibtex.php?id=293) - [(C)](http://www1.cs.columbia.edu/CAVE/publications/copyright.html) - [Project Page](http://www.cs.columbia.edu/CAVE/projects/gap_camera/)
|
bragovo/dsum_ru | ---
configs:
- config_name: dsum
task: summarization
data_files:
- split: train
path: data/train-*
language:
- ru
--- |
CyberHarem/nana_darlinginthefranxx | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Nana/ナナ (Darling in the FranXX)
This is the dataset of Nana/ナナ (Darling in the FranXX), containing 241 images and their tags.
The core tags of this character are `long_hair, red_hair, green_eyes, brown_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 241 | 125.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nana_darlinginthefranxx/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 241 | 125.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nana_darlinginthefranxx/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 405 | 190.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nana_darlinginthefranxx/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nana_darlinginthefranxx',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, red_necktie, solo, uniform, upper_body, closed_mouth, shirt |
| 1 | 5 |  |  |  |  |  | 1girl, closed_mouth, portrait, solo, uniform, upper_body |
| 2 | 10 |  |  |  |  |  | 1girl, solo, uniform, red_necktie, upper_body, breasts, long_sleeves, smile, closed_mouth, looking_at_viewer, open_mouth |
| 3 | 9 |  |  |  |  |  | 1girl, garrison_cap, long_sleeves, military_uniform, necktie, solo, breasts, closed_mouth |
| 4 | 5 |  |  |  |  |  | 1girl, black_skirt, long_skirt, long_sleeves, looking_at_viewer, necktie, standing, closed_mouth, solo, crossed_arms, grey_shirt, black_eyes, blurry, high-waist_skirt, pantyhose |
| 5 | 5 |  |  |  |  |  | 1girl, solo, curtains, 1boy |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | red_necktie | solo | uniform | upper_body | closed_mouth | shirt | portrait | breasts | long_sleeves | smile | looking_at_viewer | open_mouth | garrison_cap | military_uniform | necktie | black_skirt | long_skirt | standing | crossed_arms | grey_shirt | black_eyes | blurry | high-waist_skirt | pantyhose | curtains | 1boy |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:-------|:----------|:-------------|:---------------|:--------|:-----------|:----------|:---------------|:--------|:--------------------|:-------------|:---------------|:-------------------|:----------|:--------------|:-------------|:-----------|:---------------|:-------------|:-------------|:---------|:-------------------|:------------|:-----------|:-------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | X | X | X | X | X | | | X | X | X | X | X | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | | X | | | X | | | X | X | | | | X | X | X | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | | | X | | | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | | |
| 5 | 5 |  |  |  |  |  | X | | X | | | | | | | | | | | | | | | | | | | | | | | X | X |
|
tyzhu/lmind_hotpot_train300_eval100_v1_doc | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
splits:
- name: train_qa
num_bytes: 51441
num_examples: 300
- name: train_recite_qa
num_bytes: 312070
num_examples: 300
- name: eval_qa
num_bytes: 16148
num_examples: 100
- name: eval_recite_qa
num_bytes: 104950
num_examples: 100
- name: all_docs
num_bytes: 361191
num_examples: 797
- name: all_docs_eval
num_bytes: 361140
num_examples: 797
- name: train
num_bytes: 361191
num_examples: 797
- name: validation
num_bytes: 361191
num_examples: 797
download_size: 546922
dataset_size: 1929322
---
# Dataset Card for "lmind_hotpot_train300_eval100_v1_doc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nicholasbien/lmd_full_txt | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 10680041053.27998
num_examples: 139604
- name: test
num_bytes: 2670086765.7200212
num_examples: 34902
download_size: 4904716269
dataset_size: 13350127819.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
yangyz1230/H3K36me3_not_filtered | ---
dataset_info:
features:
- name: name
dtype: string
- name: sequence
dtype: string
- name: chrom
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: strand
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 495405
num_examples: 901
- name: test
num_bytes: 59257
num_examples: 108
download_size: 264558
dataset_size: 554662
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
tyzhu/squad_qa_wrong_num_v5_full_recite_full_passage_last_permute_rerun | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 8767142.788690874
num_examples: 4778
- name: validation
num_bytes: 584108
num_examples: 300
download_size: 1751753
dataset_size: 9351250.788690874
---
# Dataset Card for "squad_qa_wrong_num_v5_full_recite_full_passage_last_permute_rerun"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_12_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 29852717
num_examples: 16466
download_size: 15417151
dataset_size: 29852717
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_12_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
itamarcard/aula | ---
license: openrail
---
|
RUCAIBox/Translation | ---
language:
- en
- fr
- de
- cs
- es
- zh
- ru
multilinguality:
- translation
task_categories:
- translation
task_ids: []
---
These are the translation datasets collected by TextBox, including:
- WMT14 English-French (wmt14-fr-en)
- WMT16 Romanian-English (wmt16-ro-en)
- WMT16 German-English (wmt16-de-en)
- WMT19 Czech-English (wmt19-cs-en)
- WMT13 Spanish-English (wmt13-es-en)
- WMT19 Chinese-English (wmt19-zh-en)
- WMT19 Russian-English (wmt19-ru-en).
The details and leaderboard of each dataset can be found on the [TextBox page](https://github.com/RUCAIBox/TextBox#dataset). |
cyrilzhang/wiki-bpe-64k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 20157432700
num_examples: 4916447
- name: test
num_bytes: 202663000
num_examples: 49430
download_size: 8837145740
dataset_size: 20360095700
---
# Dataset Card for "wiki-bpe-64k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nllg/datikz | ---
dataset_info:
features:
- name: caption
dtype: string
- name: code
dtype: string
- name: image
dtype: image
- name: uri
dtype: string
- name: origin
dtype: string
- name: date
dtype: timestamp[us]
- name: augmented
dtype: bool
splits:
- name: test
num_bytes: 8894508.0
num_examples: 561
- name: train
num_bytes: 704548016.5
num_examples: 49692
download_size: 644268174
dataset_size: 713442524.5
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
---
# Dataset Card for DaTi*k*Z
DaTi*k*Z is a dataset of Ti*k*Z drawings aligned with captions. In compliance with licensing agreements, certain Ti*k*Z drawings are excluded from this public version of the dataset. Check out the [AutomaTi*k*Z](https://github.com/potamides/AutomaTikZ) project and the [DaTi*k*Z](https://github.com/potamides/AutomaTikZ/tree/main/dataset) repository for more information as well as tools and methods to recreate the complete dataset from scratch.
## Usage
```python
from datasets import load_dataset
# full dataset
ds = load_dataset("nllg/datikz")
# only the train split
ds = load_dataset("nllg/datikz", split="train")
for sample in ds:
print(sample["code"])
``` |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.0_seed_3_t_0.25 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43643773
num_examples: 18928
- name: epoch_1
num_bytes: 44106410
num_examples: 18928
- name: epoch_2
num_bytes: 44192744
num_examples: 18928
- name: epoch_3
num_bytes: 44244866
num_examples: 18928
- name: epoch_4
num_bytes: 44266021
num_examples: 18928
- name: epoch_5
num_bytes: 44279686
num_examples: 18928
- name: epoch_6
num_bytes: 44272065
num_examples: 18928
- name: epoch_7
num_bytes: 44269328
num_examples: 18928
- name: epoch_8
num_bytes: 44263537
num_examples: 18928
- name: epoch_9
num_bytes: 44266527
num_examples: 18928
- name: epoch_10
num_bytes: 44262806
num_examples: 18928
- name: epoch_11
num_bytes: 44266275
num_examples: 18928
- name: epoch_12
num_bytes: 44264620
num_examples: 18928
- name: epoch_13
num_bytes: 44264755
num_examples: 18928
- name: epoch_14
num_bytes: 44264932
num_examples: 18928
- name: epoch_15
num_bytes: 44263094
num_examples: 18928
- name: epoch_16
num_bytes: 44265898
num_examples: 18928
- name: epoch_17
num_bytes: 44264475
num_examples: 18928
- name: epoch_18
num_bytes: 44265226
num_examples: 18928
- name: epoch_19
num_bytes: 44264947
num_examples: 18928
- name: epoch_20
num_bytes: 44267119
num_examples: 18928
- name: epoch_21
num_bytes: 44266983
num_examples: 18928
- name: epoch_22
num_bytes: 44265235
num_examples: 18928
- name: epoch_23
num_bytes: 44267098
num_examples: 18928
- name: epoch_24
num_bytes: 44266826
num_examples: 18928
- name: epoch_25
num_bytes: 44266038
num_examples: 18928
- name: epoch_26
num_bytes: 44266984
num_examples: 18928
- name: epoch_27
num_bytes: 44266929
num_examples: 18928
- name: epoch_28
num_bytes: 44266608
num_examples: 18928
- name: epoch_29
num_bytes: 44267982
num_examples: 18928
download_size: 680443250
dataset_size: 1327119787
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
AyoubChLin/CNN_Long_articles | ---
license: apache-2.0
task_categories:
- text-classification
- zero-shot-classification
- text-generation
language:
- en
size_categories:
- 1K<n<10K
--- |
n0w0f/nomad-structure-csv | ---
license: cc-by-4.0
---
|
jxu124/llava_detail_23k | ---
dataset_info:
features:
- name: global_image_id
dtype: string
- name: image_path
dtype: string
- name: dialog
sequence:
sequence: string
- name: anns_id
dtype: string
splits:
- name: train
num_bytes: 17698232
num_examples: 23240
download_size: 7640667
dataset_size: 17698232
---
# Dataset Card for "llava_detail_23k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mwitiderrick/AlpacaCode | ---
language:
- en
license: mit
size_categories:
- 100K<n<1M
task_categories:
- question-answering
- text-generation
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4817562
num_examples: 1073
download_size: 1633970
dataset_size: 4817562
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# AlpacaCode
This is a version of [Alpaca Code](https://huggingface.co/datasets/TokenBender/code_instructions_122k_alpaca_style) formatted for instruction fine-tuning using the following
prompt template:
```
### Instruction:
Instruction
### Input:
Input
### Response:
``` |
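The template above can be rendered with a small helper. This is a minimal sketch; the function name and the `instruction`/`input`/`response` field names are assumptions inferred from the template, not guaranteed to match the dataset's actual column names.

```python
# Hypothetical helper illustrating the Alpaca-style prompt template above.
# Field names (instruction, input, response) are assumed from the template.
def format_alpaca_prompt(instruction: str, input_text: str, response: str) -> str:
    """Render one example into the instruction/input/response template."""
    return (
        "### Instruction:\n"
        f"{instruction}\n"
        "### Input:\n"
        f"{input_text}\n"
        "### Response:\n"
        f"{response}"
    )

example = format_alpaca_prompt(
    "Write a function that adds two numbers.",
    "a = 1, b = 2",
    "def add(a, b):\n    return a + b",
)
print(example.splitlines()[0])  # → ### Instruction:
```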
xwjiang2010/pile_dedupe_train_tokenized | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 100197587176
num_examples: 15000000
download_size: 46305700294
dataset_size: 100197587176
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nezhazheng/myspider | ---
license: cc-by-sa-4.0
dataset_info:
features:
- name: db_id
dtype: string
- name: query
dtype: string
- name: question
dtype: string
- name: query_toks
sequence: string
- name: query_toks_no_value
sequence: string
- name: question_toks
sequence: string
splits:
- name: train
num_bytes: 630265
num_examples: 1001
- name: validation
num_bytes: 663747
num_examples: 1001
download_size: 238230
dataset_size: 1294012
---
|
open-llm-leaderboard/details_yunconglong__7Bx4_DPO_2e | ---
pretty_name: Evaluation run of yunconglong/7Bx4_DPO_2e
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yunconglong/7Bx4_DPO_2e](https://huggingface.co/yunconglong/7Bx4_DPO_2e) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yunconglong__7Bx4_DPO_2e\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-20T07:53:49.728301](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__7Bx4_DPO_2e/blob/main/results_2024-01-20T07-53-49.728301.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6497382445744926,\n\
\ \"acc_stderr\": 0.032094653254152825,\n \"acc_norm\": 0.6495590967657305,\n\
\ \"acc_norm_stderr\": 0.03275779052761359,\n \"mc1\": 0.5030599755201959,\n\
\ \"mc1_stderr\": 0.01750317326096063,\n \"mc2\": 0.6560145251092138,\n\
\ \"mc2_stderr\": 0.014910139553633708\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6672354948805461,\n \"acc_stderr\": 0.013769863046192309,\n\
\ \"acc_norm\": 0.689419795221843,\n \"acc_norm_stderr\": 0.013522292098053067\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6793467436765585,\n\
\ \"acc_stderr\": 0.004657738398900936,\n \"acc_norm\": 0.8679545907189803,\n\
\ \"acc_norm_stderr\": 0.003378482488748873\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924003,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924003\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"\
acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"\
acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121434,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121434\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335075,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335075\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977938,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977938\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8550458715596331,\n \"acc_stderr\": 0.01509421569970048,\n \"\
acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.01509421569970048\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671631,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671631\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42905027932960893,\n\
\ \"acc_stderr\": 0.01655328786311604,\n \"acc_norm\": 0.42905027932960893,\n\
\ \"acc_norm_stderr\": 0.01655328786311604\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826528,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826528\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042117,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042117\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45827900912646674,\n\
\ \"acc_stderr\": 0.01272570165695364,\n \"acc_norm\": 0.45827900912646674,\n\
\ \"acc_norm_stderr\": 0.01272570165695364\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6601307189542484,\n \"acc_stderr\": 0.01916241858862356,\n \
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.01916241858862356\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.02752963744017493,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.02752963744017493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482708,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482708\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5030599755201959,\n\
\ \"mc1_stderr\": 0.01750317326096063,\n \"mc2\": 0.6560145251092138,\n\
\ \"mc2_stderr\": 0.014910139553633708\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.011082538847491894\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7134192570128886,\n \
\ \"acc_stderr\": 0.012454841668337695\n }\n}\n```"
repo_url: https://huggingface.co/yunconglong/7Bx4_DPO_2e
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|arc:challenge|25_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|gsm8k|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hellaswag|10_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T07-53-49.728301.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T07-53-49.728301.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- '**/details_harness|winogrande|5_2024-01-20T07-53-49.728301.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-20T07-53-49.728301.parquet'
- config_name: results
data_files:
- split: 2024_01_20T07_53_49.728301
path:
- results_2024-01-20T07-53-49.728301.parquet
- split: latest
path:
- results_2024-01-20T07-53-49.728301.parquet
---
# Dataset Card for Evaluation run of yunconglong/7Bx4_DPO_2e
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yunconglong/7Bx4_DPO_2e](https://huggingface.co/yunconglong/7Bx4_DPO_2e) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_yunconglong__7Bx4_DPO_2e",
    "harness_winogrande_5",
    split="train",
)
```
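The configuration names appear to follow a regular pattern, so a small helper can be sketched for building them (this assumes the `harness_<task>_<num_fewshot>` naming visible in the YAML header above):

```python
# Sketch: build a configuration name for this dataset, assuming the
# "harness_<task>_<num_fewshot>" pattern seen in the config_name entries.
def config_name(task: str, num_fewshot: int) -> str:
    return f"harness_{task}_{num_fewshot}"

print(config_name("hendrycksTest_anatomy", 5))  # harness_hendrycksTest_anatomy_5
```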
## Latest results
These are the [latest results from run 2024-01-20T07:53:49.728301](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__7Bx4_DPO_2e/blob/main/results_2024-01-20T07-53-49.728301.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task can be found in its timestamped splits and in the "latest" split of its configuration):
```python
{
"all": {
"acc": 0.6497382445744926,
"acc_stderr": 0.032094653254152825,
"acc_norm": 0.6495590967657305,
"acc_norm_stderr": 0.03275779052761359,
"mc1": 0.5030599755201959,
"mc1_stderr": 0.01750317326096063,
"mc2": 0.6560145251092138,
"mc2_stderr": 0.014910139553633708
},
"harness|arc:challenge|25": {
"acc": 0.6672354948805461,
"acc_stderr": 0.013769863046192309,
"acc_norm": 0.689419795221843,
"acc_norm_stderr": 0.013522292098053067
},
"harness|hellaswag|10": {
"acc": 0.6793467436765585,
"acc_stderr": 0.004657738398900936,
"acc_norm": 0.8679545907189803,
"acc_norm_stderr": 0.003378482488748873
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924003,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121434,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121434
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335075,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335075
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977938,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977938
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.01509421569970048,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.01509421569970048
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671631,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42905027932960893,
"acc_stderr": 0.01655328786311604,
"acc_norm": 0.42905027932960893,
"acc_norm_stderr": 0.01655328786311604
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826528,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826528
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042117,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042117
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45827900912646674,
"acc_stderr": 0.01272570165695364,
"acc_norm": 0.45827900912646674,
"acc_norm_stderr": 0.01272570165695364
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.01916241858862356,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.01916241858862356
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.02752963744017493,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.02752963744017493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482708,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482708
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5030599755201959,
"mc1_stderr": 0.01750317326096063,
"mc2": 0.6560145251092138,
"mc2_stderr": 0.014910139553633708
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.011082538847491894
},
"harness|gsm8k|5": {
"acc": 0.7134192570128886,
"acc_stderr": 0.012454841668337695
}
}
```
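Per-task accuracies like these can be aggregated into a single MMLU-style average with a few lines of plain Python. This is a minimal sketch over a dict shaped like the JSON above, with only a few illustrative entries reproduced:

```python
# Sketch: average the per-task accuracies for the hendrycksTest (MMLU)
# entries of a results dict shaped like the JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.26},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6444444444444445},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6907894736842105},
}

mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
mmlu_avg = sum(mmlu) / len(mmlu)
print(round(mmlu_avg, 4))  # 0.5317
```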
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
davanstrien/autotrain-data-dataset-mentions | ---
language:
- en
task_categories:
- text-classification
---
# AutoTrain Dataset for project: dataset-mentions
## Dataset Description
This dataset has been automatically processed by AutoTrain for project dataset-mentions.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": " How to use ```python from transformers import AutoTokenizer, AutoModel tokenizer = AutoTokenizer.from_pretrained(\"Geotrend/bert-base-en-fr-zh-ja-vi-cased\") model = AutoModel.from_pretrained(\"Geotrend/bert-base-en-fr-zh-ja-vi-cased\") ``` To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers). ",
"target": 0
},
{
"text": " Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 24 - eval_batch_size: 24 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ",
"target": 1
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=['dataset_mention', 'no_dataset_mention'], id=None)"
}
```
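Since `target` is a `ClassLabel`, integer labels map to names in the order listed above; a minimal sketch of that mapping:

```python
# Sketch: integer targets index into the ClassLabel names in order,
# so 0 -> "dataset_mention" and 1 -> "no_dataset_mention".
names = ["dataset_mention", "no_dataset_mention"]

def label_name(target: int) -> str:
    return names[target]

print(label_name(0))  # dataset_mention
```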
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follow:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 7428 |
| valid | 1858 |
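The table above implies roughly an 80/20 train/validation split, which can be checked with a quick sketch:

```python
# Sketch: split proportions implied by the sample counts in the table above.
splits = {"train": 7428, "valid": 1858}
total = sum(splits.values())
for name, n in splits.items():
    print(f"{name}: {n / total:.1%}")
```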
|