| datasetId | card |
|---|---|
Rohit-D/synthetic-confidential-information-injected-business-excerpts | ---
license: mit
task_categories:
- question-answering
- text-classification
- feature-extraction
- summarization
language:
- en
tags:
- business
- fine-tuning
size_categories:
- n<=1K
---
## Synthetic Confidential Information Injected Business Excerpts
This dataset provides business report excerpts into which confidential/sensitive information has been injected.
<pre>
This includes mentions of:
1. Internal Marketing Strategies.
2. Proprietary Product Composition.
3. License Internals.
4. Internal Sales Projections.
5. Confidential Patent Details.
6. Others.
</pre>
The dataset contains around 1k Business Excerpt-Reason pairs. The Reason field quotes the confidential portion of the Business Excerpt field
and succinctly explains (in about a line) why the quoted portion might be confidential.
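As a toy illustration of that pairing, the quoted portion can be pulled out of a Reason string with a small regex; the example Reason below is hypothetical and the exact field formatting is an assumption, not taken from the dataset:

```python
import re
from typing import Optional

def extract_quoted_portion(reason: str) -> Optional[str]:
    """Return the first double-quoted span of a Reason field, if any."""
    match = re.search(r'"([^"]+)"', reason)
    return match.group(1) if match else None

# Hypothetical Reason string in the format the card describes.
reason = '"Q3 revenue is projected to grow 40%" - discloses internal sales projections.'
print(extract_quoted_portion(reason))  # Q3 revenue is projected to grow 40%
```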
**Note**: All injected 'confidential information' is purely artificial; the business excerpts themselves, along with the companies, products, numbers, licenses, and patents they reference or mention, are hypothetical.
This data is to be treated purely as a simulation of what leaks in business excerpts might look like.
It does not contain, and does not intend to provide, any actual/real cases of confidential information. |
Nadav-Timor/CUAD | ---
paperswithcode_id: cuad
dataset_info:
features:
- name: title
dtype: string
- name: context
dtype: string
- name: question_id
dtype: string
- name: question
dtype: string
- name: answer_text
dtype: string
- name: answer_start
dtype: int64
splits:
- name: train
num_bytes: 1142083198
num_examples: 13823
download_size: 14209324
dataset_size: 1142083198
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "CUAD"
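The YAML above declares standard extractive-QA fields; a minimal sketch of how `answer_start` and `answer_text` relate to `context`, using a toy record rather than an actual CUAD contract:

```python
# Toy record mirroring the field schema above (not a real CUAD example).
record = {
    "context": "This Agreement shall terminate on December 31, 2025.",
    "question": "When does the agreement terminate?",
    "answer_text": "December 31, 2025",
    "answer_start": 34,
}

# answer_start is a character offset into the context string.
start = record["answer_start"]
span = record["context"][start:start + len(record["answer_text"])]
print(span)  # December 31, 2025
```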
https://arxiv.org/pdf/2103.06268.pdf |
gimmaru/glue-sst2 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': positive
- name: idx
dtype: int32
splits:
- name: validation
num_bytes: 106252
num_examples: 872
download_size: 0
dataset_size: 106252
---
# Dataset Card for "glue-sst2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
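The `label` column is a `class_label` with the names declared in the metadata above; a small sketch of decoding the integer ids (the example sentences are made up, not drawn from SST-2):

```python
# Mapping from the class_label ids declared in the card metadata.
label_names = {0: "negative", 1: "positive"}

rows = [  # made-up SST-2-style rows
    {"sentence": "a gripping, beautifully shot film", "label": 1},
    {"sentence": "tedious and overlong", "label": 0},
]
decoded = [label_names[row["label"]] for row in rows]
print(decoded)  # ['positive', 'negative']
```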
Note: This dataset was used to evaluate probability-based prompt selection techniques in the paper '[Improving Probability-based Prompt Selection Through Unified Evaluation and Analysis](https://arxiv.org/abs/2305.14877)'. It differs from the actual benchmark dataset. |
chronbmm/sandhi-split-long-pali | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: unsandhied
dtype: string
splits:
- name: train
num_bytes: 162538208
num_examples: 1228916
- name: validation
num_bytes: 1147382
num_examples: 9210
- name: test
num_bytes: 1185100
num_examples: 8761
- name: test_500
num_bytes: 57767
num_examples: 500
- name: validation_500
num_bytes: 64344
num_examples: 500
download_size: 90710303
dataset_size: 164992801
---
# Dataset Card for "sandhi-split-long-pali"
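The split metadata above can be sanity-checked in a few lines; the counts are copied from the YAML (the `*_500` splits are 500-example subsets of the full validation/test splits, which is an inference from their names):

```python
# Example counts copied from the split metadata above.
splits = {
    "train": 1_228_916,
    "validation": 9_210,
    "test": 8_761,
    "test_500": 500,        # presumably a 500-example subset of test
    "validation_500": 500,  # presumably a 500-example subset of validation
}
full_total = splits["train"] + splits["validation"] + splits["test"]
held_out = (splits["validation"] + splits["test"]) / full_total
print(full_total, f"{held_out:.2%}")  # 1246887 1.44%
```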
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_iGenius-AI-Team__LLAMA-13B-test-finetuning | ---
pretty_name: Evaluation run of iGenius-AI-Team/LLAMA-13B-test-finetuning
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [iGenius-AI-Team/LLAMA-13B-test-finetuning](https://huggingface.co/iGenius-AI-Team/LLAMA-13B-test-finetuning)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 1 configuration, corresponding to the evaluated\
\ task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_iGenius-AI-Team__LLAMA-13B-test-finetuning\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T15:13:07.767154](https://huggingface.co/datasets/open-llm-leaderboard/details_iGenius-AI-Team__LLAMA-13B-test-finetuning/blob/main/results_2023-12-02T15-13-07.767154.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.22517058377558757,\n\
\ \"acc_stderr\": 0.011505385424294625\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.22517058377558757,\n \"acc_stderr\": 0.011505385424294625\n\
\ }\n}\n```"
repo_url: https://huggingface.co/iGenius-AI-Team/LLAMA-13B-test-finetuning
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_02T15_13_07.767154
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-13-07.767154.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-13-07.767154.parquet'
- config_name: results
data_files:
- split: 2023_12_02T15_13_07.767154
path:
- results_2023-12-02T15-13-07.767154.parquet
- split: latest
path:
- results_2023-12-02T15-13-07.767154.parquet
---
# Dataset Card for Evaluation run of iGenius-AI-Team/LLAMA-13B-test-finetuning
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/iGenius-AI-Team/LLAMA-13B-test-finetuning
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [iGenius-AI-Team/LLAMA-13B-test-finetuning](https://huggingface.co/iGenius-AI-Team/LLAMA-13B-test-finetuning) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, corresponding to the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_iGenius-AI-Team__LLAMA-13B-test-finetuning",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-02T15:13:07.767154](https://huggingface.co/datasets/open-llm-leaderboard/details_iGenius-AI-Team__LLAMA-13B-test-finetuning/blob/main/results_2023-12-02T15-13-07.767154.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.22517058377558757,
"acc_stderr": 0.011505385424294625
},
"harness|gsm8k|5": {
"acc": 0.22517058377558757,
"acc_stderr": 0.011505385424294625
}
}
```
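Read programmatically, the aggregate entry mirrors the single evaluated task; the values below are copied from the results JSON above:

```python
# Values copied from the latest-results JSON above.
results = {
    "all": {"acc": 0.22517058377558757, "acc_stderr": 0.011505385424294625},
    "harness|gsm8k|5": {"acc": 0.22517058377558757, "acc_stderr": 0.011505385424294625},
}

# With a single evaluated task, the "all" aggregate equals that task's score.
assert results["all"] == results["harness|gsm8k|5"]
print(f"GSM8K 5-shot accuracy: {results['all']['acc']:.1%}")  # 22.5%
```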
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
pianoroll/maestro-do-storage | ---
dataset_info:
features:
- name: composer
dtype: string
- name: title
dtype: string
- name: midi_filename
dtype: string
- name: mp3_key
dtype: string
- name: pianoroll_key
dtype: string
- name: split
dtype: string
splits:
- name: train
num_bytes: 419735
num_examples: 1276
download_size: 89454
dataset_size: 419735
---
# Dataset Card for "maestro-do-storage"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kunishou/amenokaku-code-instruct | ---
license: other
license_name: mixed-licence
license_link: LICENSE
language:
- ja
configs:
- config_name: default
data_files:
- split: train
path: "amenokaku_code_instruct.json"
---

# Amenokaku-Code-Instruct
**Update:**
- 2023/12/27
  Added 180 records of code data from JaxTon and プロになるJava to the dataset.
## Overview
- This is a 5.2K instruction dataset specialized for code.
- The data was collected and processed from programming learning content published under licenses that permit commercial use (English content was machine-translated into Japanese, with unnatural translations corrected by hand).
- For learning content with no explicit license, the rights holders were contacted individually and permission was obtained to include the content in this dataset.
## Dataset Details
The instruction tasks break down into 1,050 code generation (code_generation) records, 150 code behavior check (check_code_behavor) records, and 4,000 code bug fix (code_fix) records.
The detailed breakdown is as follows.
|source name|num records|license|url|
|:----|:----|:----|:----|
|データサイエンス100本ノック(構造化データ加工編)(Python解答)|100|[MIT](https://github.com/The-Japan-DataScientist-Society/100knocks-preprocess/blob/master/LICENSE)|https://github.com/The-Japan-DataScientist-Society/100knocks-preprocess|
|データサイエンス100本ノック(構造化データ加工編)(SQL解答)|100|[MIT](https://github.com/rootassist/100knocks-preprocess-inSQLandPython-withColab/blob/master/LICENSE)|https://github.com/rootassist/100knocks-preprocess-inSQLandPython-withColab|
|画像処理100本ノック|100|[MIT](https://github.com/ryoppippi/Gasyori100knock/blob/master/LICENSE)|https://github.com/ryoppippi/Gasyori100knock|
|言語処理100本ノック2020|100|[MIT](https://github.com/nlp100/nlp100.github.io/blob/develop/LICENSE)<br>[MIT](https://github.com/upura/nlp100v2020/blob/master/LICENSE)|(problems) https://github.com/nlp100/nlp100.github.io<br>(answers) https://github.com/upura/nlp100v2020|
|Python初学者のためのpandas100本ノック※|100|AmenokakuCode License|https://qiita.com/kunishou/items/bd5fad9a334f4f5be51c|
|Python初学者のためのPolars100本ノック※|100|AmenokakuCode License|https://qiita.com/kunishou/items/1386d14a136f585e504e|
|100 Numpy Exercises|100|[MIT](https://github.com/rougier/numpy-100/blob/master/LICENSE.txt)|https://github.com/rougier/numpy-100|
|100 Julia Exercises|100|The Unlicense|https://github.com/RoyiAvital/Julia100Exercises|
|自作Python100本ノック|100|AmenokakuCode License|https://qiita.com/ahpjop/items/373f807d68044cda1c9b|
|Python-for-Beginners-Solve-50-Exercises-Live|50|[MIT](https://github.com/garg10may/Python-for-Beginners-Solve-50-Exercises-Live/blob/master/LICENSE)|https://github.com/garg10may/Python-for-Beginners-Solve-50-Exercises-Live|
|R初学者のためのtidyverse100本ノック|100|AmenokakuCode License|https://qiita.com/nekobo/items/cbf32a13637273f229da|
|JavaScript Questions|155|[MIT](https://github.com/lydiahallie/javascript-questions/blob/master/LICENSE)|https://github.com/lydiahallie/javascript-questions|
|Break-It-Fix-It|4,000|[MIT](https://github.com/michiyasunaga/BIFI/blob/main/LICENSE)|https://github.com/michiyasunaga/BIFI|
|JaxTon|60|Apache-2.0|https://github.com/vopani/jaxton|
|プロになるJava|120|AmenokakuCode License|https://nowokay.hatenablog.com/entry/projava17exercise2|
※ Learning content I created myself in the past.
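A sketch of what a single instruct record might look like, using the task ids named above; the exact JSON field names are an assumption, since the card does not spell out the schema:

```python
import json

# Hypothetical record; field names are assumed, the task id comes from the card.
record_json = """
{
  "instruction": "Fix the bug in the following Python code.",
  "input": "def add(a, b):\\n    return a - b",
  "output": "def add(a, b):\\n    return a + b",
  "task": "code_fix"
}
"""
record = json.loads(record_json)
print(record["task"])  # code_fix
```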
## License
Since each record's license follows that of its source, the dataset as a whole is under mixed licenses.
For data whose source states no license and for which the rights holder individually granted permission to include it for language-model training, the [AmenokakuCode License](https://github.com/kunishou/amenokaku-code-instruct/blob/main/AmenokakuCode%20License) is applied. This license permits free use of the data only for training language models (so selling or redistributing the data itself is not permitted).
## Dataset Updates
The dataset will be extended as further commercially usable programming learning content is found.
**If you come across useful content, or would be willing to contribute learning content you created yourself, please get in touch.**
## Dataset Name
Amenokaku is named after [天迦久神](http://kojiki.kokugakuin.ac.jp/shinmei/amenokakunokami/) (Ame-no-Kaku-no-Kami), a deer deity appearing in the Kojiki.
## Github
https://github.com/kunishou/amenokaku-code-instruct |
naufalnashif/tweets-biskita-transpakuan-2022 | ---
license: mit
---
|
open-llm-leaderboard/details_allknowingroger__FrankenLimmy-10B-passthrough | ---
pretty_name: Evaluation run of allknowingroger/FrankenLimmy-10B-passthrough
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [allknowingroger/FrankenLimmy-10B-passthrough](https://huggingface.co/allknowingroger/FrankenLimmy-10B-passthrough)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_allknowingroger__FrankenLimmy-10B-passthrough\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-11T06:52:11.506135](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__FrankenLimmy-10B-passthrough/blob/main/results_2024-04-11T06-52-11.506135.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6423348635921872,\n\
\ \"acc_stderr\": 0.03231625223252546,\n \"acc_norm\": 0.6446030902485047,\n\
\ \"acc_norm_stderr\": 0.03297304000372783,\n \"mc1\": 0.5924112607099143,\n\
\ \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.7379035451562636,\n\
\ \"mc2_stderr\": 0.014559397581751874\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6825938566552902,\n \"acc_stderr\": 0.013602239088038169,\n\
\ \"acc_norm\": 0.7167235494880546,\n \"acc_norm_stderr\": 0.013167478735134575\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7160924118701454,\n\
\ \"acc_stderr\": 0.004499710284381918,\n \"acc_norm\": 0.8863772156940849,\n\
\ \"acc_norm_stderr\": 0.0031670398072286784\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901409,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901409\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.028985455652334388,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.028985455652334388\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4417989417989418,\n \"acc_stderr\": 0.025576257061253833,\n \"\
acc_norm\": 0.4417989417989418,\n \"acc_norm_stderr\": 0.025576257061253833\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"\
acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959217,\n \"\
acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959217\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124488,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124488\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276876,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276876\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059274,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059274\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.0399552400768168,\n \"acc_norm\"\
: 0.3973509933774834,\n \"acc_norm_stderr\": 0.0399552400768168\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n\
\ \"acc_stderr\": 0.015990154885073368,\n \"acc_norm\": 0.8330275229357799,\n\
\ \"acc_norm_stderr\": 0.015990154885073368\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5879629629629629,\n \"acc_stderr\": 0.03356787758160831,\n\
\ \"acc_norm\": 0.5879629629629629,\n \"acc_norm_stderr\": 0.03356787758160831\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8270042194092827,\n \"acc_stderr\": 0.024621562866768424,\n \
\ \"acc_norm\": 0.8270042194092827,\n \"acc_norm_stderr\": 0.024621562866768424\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n\
\ \"acc_stderr\": 0.030360379710291954,\n \"acc_norm\": 0.7130044843049327,\n\
\ \"acc_norm_stderr\": 0.030360379710291954\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"\
acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.013547415658662253,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.013547415658662253\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.02519018132760841,\n\
\ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.02519018132760841\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3776536312849162,\n\
\ \"acc_stderr\": 0.01621414875213663,\n \"acc_norm\": 0.3776536312849162,\n\
\ \"acc_norm_stderr\": 0.01621414875213663\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.02633661346904664,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.02633661346904664\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153273,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153273\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49022164276401564,\n\
\ \"acc_stderr\": 0.012767793787729333,\n \"acc_norm\": 0.49022164276401564,\n\
\ \"acc_norm_stderr\": 0.012767793787729333\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.696078431372549,\n \"acc_stderr\": 0.01860755213127983,\n \
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.01860755213127983\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5924112607099143,\n\
\ \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.7379035451562636,\n\
\ \"mc2_stderr\": 0.014559397581751874\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8382004735595896,\n \"acc_stderr\": 0.010350128010292406\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5170583775587566,\n \
\ \"acc_stderr\": 0.013764467123761316\n }\n}\n```"
repo_url: https://huggingface.co/allknowingroger/FrankenLimmy-10B-passthrough
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|arc:challenge|25_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|gsm8k|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hellaswag|10_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T06-52-11.506135.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-11T06-52-11.506135.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- '**/details_harness|winogrande|5_2024-04-11T06-52-11.506135.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-11T06-52-11.506135.parquet'
- config_name: results
data_files:
- split: 2024_04_11T06_52_11.506135
path:
- results_2024-04-11T06-52-11.506135.parquet
- split: latest
path:
- results_2024-04-11T06-52-11.506135.parquet
---
# Dataset Card for Evaluation run of allknowingroger/FrankenLimmy-10B-passthrough
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [allknowingroger/FrankenLimmy-10B-passthrough](https://huggingface.co/allknowingroger/FrankenLimmy-10B-passthrough) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_allknowingroger__FrankenLimmy-10B-passthrough",
"harness_winogrande_5",
	split="latest")
```
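Split names are derived from the run timestamp by replacing the date and time separators (`-` and `:`) with underscores, while the fractional-seconds dot is kept. A small helper (a sketch for convenience, not part of the official tooling) can reconstruct the split name for a given run timestamp:

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp such as '2024-04-11T06:52:11.506135'
    into the corresponding split name '2024_04_11T06_52_11.506135'."""
    # Only the date/time separators change; the fractional part keeps its dot.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-04-11T06:52:11.506135"))
# → 2024_04_11T06_52_11.506135
```

This is useful when selecting a specific timestamped split instead of the "latest" alias.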
## Latest results
These are the [latest results from run 2024-04-11T06:52:11.506135](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__FrankenLimmy-10B-passthrough/blob/main/results_2024-04-11T06-52-11.506135.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6423348635921872,
"acc_stderr": 0.03231625223252546,
"acc_norm": 0.6446030902485047,
"acc_norm_stderr": 0.03297304000372783,
"mc1": 0.5924112607099143,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.7379035451562636,
"mc2_stderr": 0.014559397581751874
},
"harness|arc:challenge|25": {
"acc": 0.6825938566552902,
"acc_stderr": 0.013602239088038169,
"acc_norm": 0.7167235494880546,
"acc_norm_stderr": 0.013167478735134575
},
"harness|hellaswag|10": {
"acc": 0.7160924118701454,
"acc_stderr": 0.004499710284381918,
"acc_norm": 0.8863772156940849,
"acc_norm_stderr": 0.0031670398072286784
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901409,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901409
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.028985455652334388,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.028985455652334388
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4417989417989418,
"acc_stderr": 0.025576257061253833,
"acc_norm": 0.4417989417989418,
"acc_norm_stderr": 0.025576257061253833
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124488,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276876,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276876
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059274,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059274
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.0399552400768168,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.0399552400768168
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.015990154885073368,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.015990154885073368
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5879629629629629,
"acc_stderr": 0.03356787758160831,
"acc_norm": 0.5879629629629629,
"acc_norm_stderr": 0.03356787758160831
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8270042194092827,
"acc_stderr": 0.024621562866768424,
"acc_norm": 0.8270042194092827,
"acc_norm_stderr": 0.024621562866768424
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291954,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662253,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662253
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.02519018132760841,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.02519018132760841
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3776536312849162,
"acc_stderr": 0.01621414875213663,
"acc_norm": 0.3776536312849162,
"acc_norm_stderr": 0.01621414875213663
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.02633661346904664,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.02633661346904664
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153273,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153273
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49022164276401564,
"acc_stderr": 0.012767793787729333,
"acc_norm": 0.49022164276401564,
"acc_norm_stderr": 0.012767793787729333
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.01860755213127983,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.01860755213127983
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5924112607099143,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.7379035451562636,
"mc2_stderr": 0.014559397581751874
},
"harness|winogrande|5": {
"acc": 0.8382004735595896,
"acc_stderr": 0.010350128010292406
},
"harness|gsm8k|5": {
"acc": 0.5170583775587566,
"acc_stderr": 0.013764467123761316
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
owanr/o1o2o3_large_r2_iterater_with_human_pref_practice | ---
dataset_info:
features:
- name: src
dtype: string
- name: tgt
dtype: string
splits:
- name: train
num_bytes: 13000854
num_examples: 34758
- name: val
num_bytes: 649176
num_examples: 1692
- name: test
num_bytes: 666158
num_examples: 1707
download_size: 2384308
dataset_size: 14316188
---
# Dataset Card for "o1o2o3_large_r2_iterater_with_human_pref_practice"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cstr/intel_orca_dpo_pairs_de | ---
language:
- de
license: apache-2.0
---
German AzureML translation of mayflowergmbh/intel_orca_dpo_pairs_de; here it is only converted back to the original JSONL structure |
RicardoRei/wmt-mqm-error-spans | ---
license: apache-2.0
language:
- en
- de
- ru
- zh
tags:
- mt-evaluation
- WMT
- MQM
size_categories:
- 100K<n<1M
---
# Dataset Summary
This dataset contains all MQM human annotations from previous [WMT Metrics shared tasks](https://wmt-metrics-task.github.io/) and the MQM annotations from [Experts, Errors, and Context](https://aclanthology.org/2021.tacl-1.87/), in the form of error spans. It also contains some of the hallucinations used in training the [XCOMET models](https://huggingface.co/Unbabel/XCOMET-XXL).
**Please note that this is not an official release of the data** and the original data can be found [here](https://github.com/google/wmt-mqm-human-evaluation).
The data is organised into the following columns:
- src: input text
- mt: translation
- ref: reference translation
- annotations: List of error spans (dictionaries with 'start', 'end', 'severity', 'text')
- lp: language pair
While `en-ru` was annotated by Unbabel, `en-de` and `zh-en` were annotated by Google. This means that for en-de and zh-en you will only find minor and major errors, while for en-ru you can also find a few critical errors.
## Python usage:
```python
from datasets import load_dataset
dataset = load_dataset("RicardoRei/wmt-mqm-error-spans", split="train")
```
There is no standard train/test split for this dataset, but you can easily split it by year, language pair, or domain, e.g.:
```python
# split by LP
data = dataset.filter(lambda example: example["lp"] == "en-de")
```
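The error spans in the `annotations` field can be unpacked row by row. A minimal sketch with a hypothetical toy row (the character offsets are assumed here to index into the `mt` string, matching the field layout described above):

```python
from collections import Counter

# Toy row mirroring the dataset's column layout: "annotations" is a list of
# {"start", "end", "severity", "text"} dicts whose offsets are assumed to
# point into the machine translation ("mt").
row = {
    "mt": "The cat sat on the mata.",
    "annotations": [
        {"start": 19, "end": 23, "severity": "minor", "text": "mata"},
    ],
}

# Each span's text should match the slice of "mt" it points at.
for span in row["annotations"]:
    assert row["mt"][span["start"]:span["end"]] == span["text"]

# Tally error severities for the row.
severity_counts = Counter(s["severity"] for s in row["annotations"])
print(severity_counts)  # Counter({'minor': 1})
```

The same tally, mapped over the real dataset with `Counter` accumulation, gives a quick severity distribution per language pair.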
## Citation Information
If you use this data please cite the following works:
- [Experts, Errors, and Context: A Large-Scale Study of Human Evaluation for Machine Translation](https://aclanthology.org/2021.tacl-1.87/)
- [Results of the WMT21 Metrics Shared Task: Evaluating Metrics with Expert-based Human Evaluations on TED and News Domain](https://aclanthology.org/2021.wmt-1.73/)
- [Results of WMT22 Metrics Shared Task: Stop Using BLEU – Neural Metrics Are Better and More Robust](https://aclanthology.org/2022.wmt-1.2/)
- [xCOMET: Transparent Machine Translation Evaluation through Fine-grained Error Detection](https://arxiv.org/pdf/2310.10482.pdf)
|
SuperrWu/my_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 8337027.0
num_examples: 4
download_size: 7674122
dataset_size: 8337027.0
---
# Dataset Card for "my_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
A-Bar/nl-de_non_top_cs_dev | ---
dataset_info:
features:
- name: query
dtype: string
- name: passage
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 42259840
num_examples: 100000
download_size: 17593225
dataset_size: 42259840
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_mnli_zero_degree | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 79962
num_examples: 336
- name: dev_mismatched
num_bytes: 91330
num_examples: 356
- name: test_matched
num_bytes: 72676
num_examples: 310
- name: test_mismatched
num_bytes: 88644
num_examples: 359
- name: train
num_bytes: 3469048
num_examples: 14200
download_size: 2375002
dataset_size: 3801660
---
# Dataset Card for "MULTI_VALUE_mnli_zero_degree"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
michaelmallari/airbnb-usa-ca-sandiego | ---
license: mit
---
|
kheopss/f3.0_f4.0_to_hermes | ---
dataset_info:
features:
- name: text
dtype: string
- name: text2
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 22526464
num_examples: 2460
download_size: 8310426
dataset_size: 22526464
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zdreiosis/ffa_grab_1 | ---
license: other
---
|
senhorsapo/merli | ---
license: openrail
---
|
goodemagod/sommy-2.5 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 956074
num_examples: 1000
download_size: 553417
dataset_size: 956074
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AShabana/thenewtest | ---
license: apache-2.0
---
|
lmwang/MultiSports | ---
license: mit
---
|
zicsx/OSCAR-2301-Hindi-Cleaned | ---
license: apache-2.0
task_categories:
- text-generation
language:
- hi
tags:
- ' OSCAR-2301'
size_categories:
- 100K<n<1M
---
# Dataset Card for "OSCAR-2301-Hindi-Cleaned-2.0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andersonbcdefg/anli_triples | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 8130810
num_examples: 17965
download_size: 3849833
dataset_size: 8130810
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jxm/ag_news__gtr_base__dpr | ---
dataset_info:
features:
- name: text
dtype: string
- name: embeddings_A
sequence: float32
- name: embeddings_B
sequence: float32
splits:
- name: train
num_bytes: 48573874
num_examples: 7600
download_size: 57917827
dataset_size: 48573874
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-launch__gov_report-plain_text-2fa37c-16136228 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- launch/gov_report
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP
metrics: ['bertscore']
dataset_name: launch/gov_report
dataset_config: plain_text
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP
* Dataset: launch/gov_report
* Config: plain_text
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nonchalant-nagavalli](https://huggingface.co/nonchalant-nagavalli) for evaluating this model. |
autoevaluate/autoeval-eval-emotion-default-2be497-1508254837 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: morenolq/distilbert-base-cased-emotion
metrics: []
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: morenolq/distilbert-base-cased-emotion
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@morenolq](https://huggingface.co/morenolq) for evaluating this model. |
joey234/mmlu-high_school_mathematics-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6798
num_examples: 5
- name: test
num_bytes: 654905
num_examples: 270
download_size: 15082
dataset_size: 661703
---
# Dataset Card for "mmlu-high_school_mathematics-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_yam-peleg__gemma-7b-experiment | ---
pretty_name: Evaluation run of yam-peleg/gemma-7b-experiment
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yam-peleg/gemma-7b-experiment](https://huggingface.co/yam-peleg/gemma-7b-experiment)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yam-peleg__gemma-7b-experiment\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-16T15:33:57.987521](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__gemma-7b-experiment/blob/main/results_2024-03-16T15-33-57.987521.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6580452433778683,\n\
\ \"acc_stderr\": 0.03198812334565303,\n \"acc_norm\": 0.662225563457007,\n\
\ \"acc_norm_stderr\": 0.03262216078960403,\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.4490548840372056,\n\
\ \"mc2_stderr\": 0.014654652028381131\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5708191126279863,\n \"acc_stderr\": 0.014464085894870653,\n\
\ \"acc_norm\": 0.6109215017064846,\n \"acc_norm_stderr\": 0.014247309976045607\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.622087233618801,\n\
\ \"acc_stderr\": 0.0048387473057833474,\n \"acc_norm\": 0.8247361083449513,\n\
\ \"acc_norm_stderr\": 0.0037941565512722643\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.02854479331905533,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.02854479331905533\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n\
\ \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.6994219653179191,\n\
\ \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n\
\ \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.0402873153294756,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.0402873153294756\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5026455026455027,\n \"acc_stderr\": 0.025750949678130387,\n \"\
acc_norm\": 0.5026455026455027,\n \"acc_norm_stderr\": 0.025750949678130387\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8032258064516129,\n \"acc_stderr\": 0.022616409420742025,\n \"\
acc_norm\": 0.8032258064516129,\n \"acc_norm_stderr\": 0.022616409420742025\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"\
acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8232323232323232,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.8232323232323232,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n\
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.42962962962962964,\n \"acc_stderr\": 0.030182099804387262,\n \
\ \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.030182099804387262\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.030588697013783642,\n\
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.030588697013783642\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.0402614149763461,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.0402614149763461\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5787037037037037,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8523206751054853,\n \"acc_stderr\": 0.0230943295825957,\n \
\ \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.0230943295825957\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n\
\ \"acc_stderr\": 0.030216831011508766,\n \"acc_norm\": 0.7174887892376681,\n\
\ \"acc_norm_stderr\": 0.030216831011508766\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547129,\n \"\
acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547129\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.019875655027867433,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.019875655027867433\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8378033205619413,\n\
\ \"acc_stderr\": 0.01318222261672089,\n \"acc_norm\": 0.8378033205619413,\n\
\ \"acc_norm_stderr\": 0.01318222261672089\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4033519553072626,\n\
\ \"acc_stderr\": 0.016407123032195253,\n \"acc_norm\": 0.4033519553072626,\n\
\ \"acc_norm_stderr\": 0.016407123032195253\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340866,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340866\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.025403832978179604,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.025403832978179604\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4810951760104302,\n\
\ \"acc_stderr\": 0.012761104871472658,\n \"acc_norm\": 0.4810951760104302,\n\
\ \"acc_norm_stderr\": 0.012761104871472658\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.029289413409403196,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.029289413409403196\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6879084967320261,\n \"acc_stderr\": 0.018745011201277657,\n \
\ \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.018745011201277657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399663,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399663\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.4490548840372056,\n\
\ \"mc2_stderr\": 0.014654652028381131\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7845303867403315,\n \"acc_stderr\": 0.011555295286059282\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5276724791508719,\n \
\ \"acc_stderr\": 0.013751375538801323\n }\n}\n```"
repo_url: https://huggingface.co/yam-peleg/gemma-7b-experiment
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|arc:challenge|25_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|gsm8k|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hellaswag|10_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T15-33-57.987521.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-16T15-33-57.987521.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- '**/details_harness|winogrande|5_2024-03-16T15-33-57.987521.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-16T15-33-57.987521.parquet'
- config_name: results
data_files:
- split: 2024_03_16T15_33_57.987521
path:
- results_2024-03-16T15-33-57.987521.parquet
- split: latest
path:
- results_2024-03-16T15-33-57.987521.parquet
---
# Dataset Card for Evaluation run of yam-peleg/gemma-7b-experiment
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yam-peleg/gemma-7b-experiment](https://huggingface.co/yam-peleg/gemma-7b-experiment) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yam-peleg__gemma-7b-experiment",
"harness_winogrande_5",
split="train")
```
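The per-run split names follow directly from the run timestamp. A small sketch of the mapping, inferred from the split and parquet file names listed in this card's metadata (not an official API):

```python
# Inferred convention (from this card's metadata): the run timestamp
# "2024-03-16T15:33:57.987521" becomes the split name by replacing the
# date and time separators with underscores.
run_timestamp = "2024-03-16T15:33:57.987521"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2024_03_16T15_33_57.987521
```

This matches the timestamped split `2024_03_16T15_33_57.987521` used throughout the configurations above.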
## Latest results
These are the [latest results from run 2024-03-16T15:33:57.987521](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__gemma-7b-experiment/blob/main/results_2024-03-16T15-33-57.987521.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6580452433778683,
"acc_stderr": 0.03198812334565303,
"acc_norm": 0.662225563457007,
"acc_norm_stderr": 0.03262216078960403,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144912,
"mc2": 0.4490548840372056,
"mc2_stderr": 0.014654652028381131
},
"harness|arc:challenge|25": {
"acc": 0.5708191126279863,
"acc_stderr": 0.014464085894870653,
"acc_norm": 0.6109215017064846,
"acc_norm_stderr": 0.014247309976045607
},
"harness|hellaswag|10": {
"acc": 0.622087233618801,
"acc_stderr": 0.0048387473057833474,
"acc_norm": 0.8247361083449513,
"acc_norm_stderr": 0.0037941565512722643
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.02854479331905533,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.02854479331905533
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0349610148119118,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0349610148119118
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.0402873153294756,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.0402873153294756
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5026455026455027,
"acc_stderr": 0.025750949678130387,
"acc_norm": 0.5026455026455027,
"acc_norm_stderr": 0.025750949678130387
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8232323232323232,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.8232323232323232,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.030182099804387262,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.030182099804387262
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.030588697013783642,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.030588697013783642
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.0402614149763461,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.0402614149763461
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.0230943295825957,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.0230943295825957
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508766,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508766
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547129,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547129
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867433,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867433
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8378033205619413,
"acc_stderr": 0.01318222261672089,
"acc_norm": 0.8378033205619413,
"acc_norm_stderr": 0.01318222261672089
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4033519553072626,
"acc_stderr": 0.016407123032195253,
"acc_norm": 0.4033519553072626,
"acc_norm_stderr": 0.016407123032195253
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340866,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340866
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179604,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4810951760104302,
"acc_stderr": 0.012761104871472658,
"acc_norm": 0.4810951760104302,
"acc_norm_stderr": 0.012761104871472658
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.029289413409403196,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.029289413409403196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6879084967320261,
"acc_stderr": 0.018745011201277657,
"acc_norm": 0.6879084967320261,
"acc_norm_stderr": 0.018745011201277657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399663,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399663
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144912,
"mc2": 0.4490548840372056,
"mc2_stderr": 0.014654652028381131
},
"harness|winogrande|5": {
"acc": 0.7845303867403315,
"acc_stderr": 0.011555295286059282
},
"harness|gsm8k|5": {
"acc": 0.5276724791508719,
"acc_stderr": 0.013751375538801323
}
}
```
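Every per-task entry in the JSON above shares the same shape (`acc`, `acc_stderr`, and where defined the `_norm` variants), so the results are easy to scan programmatically once loaded. A minimal sketch using a few entries copied from the run above:

```python
# A few per-task entries copied from the results JSON above
# (only acc/acc_stderr kept here for brevity).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31, "acc_stderr": 0.046482319871173156},
    "harness|hendrycksTest-college_mathematics|5": {"acc": 0.36, "acc_stderr": 0.04824181513244218},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8187134502923976, "acc_stderr": 0.029547741687640038},
}

# Flag tasks scoring below 0.5, i.e. close to chance for 4-way multiple choice.
weak_tasks = sorted(name for name, m in results.items() if m["acc"] < 0.5)
print(weak_tasks)
```

The same scan works on the full dict loaded from the results parquet/JSON files referenced in this card's metadata.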
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
danjacobellis/audio_har_descript_44kHz_frames_1240_01p | ---
dataset_info:
features:
- name: codes
dtype:
array2_d:
shape:
- 9
- 640
dtype: float32
- name: label
dtype:
class_label:
names:
'0': No Activity
'1': Writing
'2': Drawing
'3': Cutting paper
'4': Typing on keyboard
'5': Typing on phone
'6': Browsing on phone
'7': Clapping
'8': Shuffling cards
'9': Scratching
'10': Wiping table
'11': Brushing hair
'12': Washing hands
'13': Drinking
'14': Eating snacks
'15': Brushing teeth
'16': Chopping
'17': Grating
'18': Frying
'19': Sweeping
'20': Vacuuming
'21': Washing dishes
'22': Filling water
'23': Using microwave
- name: label_str
dtype: string
- name: participant
dtype: int32
splits:
- name: train
num_bytes: 29953937
num_examples: 670
download_size: 9340318
dataset_size: 29953937
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
valashir/super-mario-bros-levels | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: preprocessed_image
dtype: image
splits:
- name: train
num_bytes: 1698876.75
num_examples: 2098
download_size: 849061
dataset_size: 1698876.75
---
# Dataset Card for "super-mario-bros-levels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mrfakename/librivox-full-catalog-archive | ---
license: cc0-1.0
---
# LibriVox Catalog Archive
Note: this archive does not include any audio files; it contains only the catalog.
## What is LibriVox?
LibriVox is a catalog of free and public domain audiobooks. [Learn more...](https://librivox.org/)
Last updated: Sep 25, 2023 |
davidgaofc/RM_inout | ---
license: mit
dataset_info:
features:
- name: Text
dtype: string
- name: Label
dtype: int64
splits:
- name: train
num_bytes: 791717
num_examples: 1640
download_size: 349585
dataset_size: 791717
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Amirkid/stanford_alpaca | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 73322820
num_examples: 104004
download_size: 518089
dataset_size: 73322820
---
# Dataset Card for "stanford_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Deojoandco/ah_full_dialog_annotation | ---
dataset_info:
features:
- name: url
dtype: string
- name: id
dtype: string
- name: num_comments
dtype: int64
- name: name
dtype: string
- name: title
dtype: string
- name: body
dtype: string
- name: score
dtype: int64
- name: upvote_ratio
dtype: float64
- name: distinguished
dtype: string
- name: over_18
dtype: bool
- name: created_utc
dtype: float64
- name: comments
list:
- name: body
dtype: string
- name: created_utc
dtype: float64
- name: distinguished
dtype: string
- name: id
dtype: string
- name: permalink
dtype: string
- name: score
dtype: int64
- name: best_num_comments
dtype: int64
- name: query
dtype: string
- name: dialog
dtype: string
- name: dialog_success
dtype: bool
- name: __index_level_0__
dtype: float64
- name: annotation_error
dtype: bool
- name: annotation
struct:
- name: Error
dtype: string
- name: Success
dtype: bool
- name: success
dtype: bool
- name: text
dtype: string
- name: Error
dtype: bool
splits:
- name: train
num_bytes: 33886049
num_examples: 2921
download_size: 19222113
dataset_size: 33886049
---
# Dataset Card for "ah_full_dialog_annotation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mayjestro/LittleHodler | ---
license: c-uda
---
|
open-llm-leaderboard/details_yanolja__EEVE-Korean-Instruct-10.8B-v1.0 | ---
pretty_name: Evaluation run of yanolja/EEVE-Korean-Instruct-10.8B-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yanolja/EEVE-Korean-Instruct-10.8B-v1.0](https://huggingface.co/yanolja/EEVE-Korean-Instruct-10.8B-v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yanolja__EEVE-Korean-Instruct-10.8B-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-24T20:26:58.872748](https://huggingface.co/datasets/open-llm-leaderboard/details_yanolja__EEVE-Korean-Instruct-10.8B-v1.0/blob/main/results_2024-02-24T20-26-58.872748.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6423992666107647,\n\
\ \"acc_stderr\": 0.032076528166469165,\n \"acc_norm\": 0.6456042916393419,\n\
\ \"acc_norm_stderr\": 0.03272409578070873,\n \"mc1\": 0.38555691554467564,\n\
\ \"mc1_stderr\": 0.01703883901059167,\n \"mc2\": 0.540863060368421,\n\
\ \"mc2_stderr\": 0.015569038830817047\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6075085324232082,\n \"acc_stderr\": 0.014269634635670717,\n\
\ \"acc_norm\": 0.6484641638225256,\n \"acc_norm_stderr\": 0.013952413699600938\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6406094403505278,\n\
\ \"acc_stderr\": 0.004788412062375688,\n \"acc_norm\": 0.8304122684724159,\n\
\ \"acc_norm_stderr\": 0.0037450326672282845\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.028985455652334395,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.028985455652334395\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.0255250343824749,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.0255250343824749\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n\
\ \"acc_stderr\": 0.02261640942074202,\n \"acc_norm\": 0.8032258064516129,\n\
\ \"acc_norm_stderr\": 0.02261640942074202\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047709,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047709\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8383838383838383,\n \"acc_stderr\": 0.02622591986362928,\n \"\
acc_norm\": 0.8383838383838383,\n \"acc_norm_stderr\": 0.02622591986362928\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.02478431694215639,\n \
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.02478431694215639\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881564,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881564\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\"\
: 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8354430379746836,\n \"acc_stderr\": 0.024135736240566932,\n \"\
acc_norm\": 0.8354430379746836,\n \"acc_norm_stderr\": 0.024135736240566932\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128138,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128138\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579828,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579828\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545546,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545546\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n\
\ \"acc_stderr\": 0.016435865260914746,\n \"acc_norm\": 0.40782122905027934,\n\
\ \"acc_norm_stderr\": 0.016435865260914746\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984806,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984806\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49608865710560623,\n\
\ \"acc_stderr\": 0.012769845366441194,\n \"acc_norm\": 0.49608865710560623,\n\
\ \"acc_norm_stderr\": 0.012769845366441194\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687492,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687492\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.026358916334904028,\n\
\ \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.026358916334904028\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38555691554467564,\n\
\ \"mc1_stderr\": 0.01703883901059167,\n \"mc2\": 0.540863060368421,\n\
\ \"mc2_stderr\": 0.015569038830817047\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.010814911009613992\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5072024260803639,\n \
\ \"acc_stderr\": 0.013771055751972872\n }\n}\n```"
repo_url: https://huggingface.co/yanolja/EEVE-Korean-Instruct-10.8B-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|arc:challenge|25_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|gsm8k|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hellaswag|10_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-24T20-26-58.872748.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-24T20-26-58.872748.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_24T20_26_58.872748
path:
- '**/details_harness|winogrande|5_2024-02-24T20-26-58.872748.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-24T20-26-58.872748.parquet'
- config_name: results
data_files:
- split: 2024_02_24T16_36_38.163475
path:
- results_2024-02-24T16-36-38.163475.parquet
- split: 2024_02_24T20_26_58.872748
path:
- results_2024-02-24T20-26-58.872748.parquet
- split: latest
path:
- results_2024-02-24T20-26-58.872748.parquet
---
# Dataset Card for Evaluation run of yanolja/EEVE-Korean-Instruct-10.8B-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yanolja/EEVE-Korean-Instruct-10.8B-v1.0](https://huggingface.co/yanolja/EEVE-Korean-Instruct-10.8B-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yanolja__EEVE-Korean-Instruct-10.8B-v1.0",
"harness_winogrande_5",
	split="latest")
```
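The per-run split names that appear in the configuration above seem to follow a simple convention: the run timestamp with dashes and colons replaced by underscores. A minimal sketch of that mapping (the helper name `timestamp_to_split` is ours, for illustration):

```python
def timestamp_to_split(run_timestamp: str) -> str:
    """Map a run timestamp to its split name by swapping '-' and ':' for '_'."""
    return run_timestamp.replace("-", "_").replace(":", "_")

# The run shown in this card:
print(timestamp_to_split("2024-02-24T20:26:58.872748"))
# -> 2024_02_24T20_26_58.872748
```

The resulting name can be passed as the `split` argument to `load_dataset` to pin a specific evaluation run instead of the "latest" split.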
## Latest results
These are the [latest results from run 2024-02-24T20:26:58.872748](https://huggingface.co/datasets/open-llm-leaderboard/details_yanolja__EEVE-Korean-Instruct-10.8B-v1.0/blob/main/results_2024-02-24T20-26-58.872748.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6423992666107647,
"acc_stderr": 0.032076528166469165,
"acc_norm": 0.6456042916393419,
"acc_norm_stderr": 0.03272409578070873,
"mc1": 0.38555691554467564,
"mc1_stderr": 0.01703883901059167,
"mc2": 0.540863060368421,
"mc2_stderr": 0.015569038830817047
},
"harness|arc:challenge|25": {
"acc": 0.6075085324232082,
"acc_stderr": 0.014269634635670717,
"acc_norm": 0.6484641638225256,
"acc_norm_stderr": 0.013952413699600938
},
"harness|hellaswag|10": {
"acc": 0.6406094403505278,
"acc_stderr": 0.004788412062375688,
"acc_norm": 0.8304122684724159,
"acc_norm_stderr": 0.0037450326672282845
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.028985455652334395,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.028985455652334395
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.0255250343824749,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.0255250343824749
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.02261640942074202,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.02261640942074202
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8383838383838383,
"acc_stderr": 0.02622591986362928,
"acc_norm": 0.8383838383838383,
"acc_norm_stderr": 0.02622591986362928
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.02478431694215639,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.02478431694215639
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881564,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881564
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616323,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8354430379746836,
"acc_stderr": 0.024135736240566932,
"acc_norm": 0.8354430379746836,
"acc_norm_stderr": 0.024135736240566932
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128138,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128138
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579828,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579828
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545546,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545546
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40782122905027934,
"acc_stderr": 0.016435865260914746,
"acc_norm": 0.40782122905027934,
"acc_norm_stderr": 0.016435865260914746
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984806,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984806
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49608865710560623,
"acc_stderr": 0.012769845366441194,
"acc_norm": 0.49608865710560623,
"acc_norm_stderr": 0.012769845366441194
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687492,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687492
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.026358916334904028,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.026358916334904028
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38555691554467564,
"mc1_stderr": 0.01703883901059167,
"mc2": 0.540863060368421,
"mc2_stderr": 0.015569038830817047
},
"harness|winogrande|5": {
"acc": 0.819258089976322,
"acc_stderr": 0.010814911009613992
},
"harness|gsm8k|5": {
"acc": 0.5072024260803639,
"acc_stderr": 0.013771055751972872
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Yehoon/arc_hella | ---
dataset_info:
features:
- name: question
dtype: string
- name: options
sequence: string
- name: answer
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 8854506
num_examples: 12418
download_size: 5407350
dataset_size: 8854506
---
# Dataset Card for "arc_hella"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jxm/msmarco__gtr_base__dpr | ---
dataset_info:
features:
- name: text
dtype: string
- name: embeddings_A
sequence: float32
- name: embeddings_B
sequence: float32
splits:
- name: train
num_bytes: 647745307
num_examples: 100000
download_size: 757450091
dataset_size: 647745307
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
leostelon/california-housing | ---
license: mit
---
|
CyberHarem/kris_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kris/クリス (Pokémon)
This is the dataset of kris/クリス (Pokémon), containing 425 images and their tags.
The core tags of this character are `twintails, hat, bangs, long_hair, blue_hair, green_hair, yellow_headwear, blue_eyes, green_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 425 | 289.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kris_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 425 | 209.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kris_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 683 | 354.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kris_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 425 | 271.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kris_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 683 | 440.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kris_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kris_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, bike_shorts, cropped_jacket, holding_poke_ball, long_sleeves, poke_ball_(basic), red_shirt, white_jacket, open_jacket, open_mouth, solo, :d, pokemon_(creature) |
| 1 | 13 |  |  |  |  |  | 1girl, red_shirt, white_jacket, simple_background, upper_body, eyelashes, open_jacket, white_background, cropped_jacket, solo, :d, blush, open_mouth, long_sleeves, ^_^, closed_mouth, tongue |
| 2 | 8 |  |  |  |  |  | 1girl, bike_shorts, holding_poke_ball, poke_ball_(basic), pokemon_(creature) |
| 3 | 11 |  |  |  |  |  | 1girl, bike_shorts, smile, pokemon_(creature), open_mouth |
| 4 | 7 |  |  |  |  |  | 1girl, cosplay, hat_ribbon, overalls, red_ribbon, star_earrings, solo, cabbie_hat, smile, blush, poke_ball_(basic), thighhighs |
| 5 | 5 |  |  |  |  |  | 1girl, blush, solo, bike_shorts, one_eye_closed |
| 6 | 8 |  |  |  |  |  | 1girl, hetero, penis, completely_nude, vaginal, 1boy, ass, blush, open_mouth, anus, medium_breasts, nipples, testicles, barefoot, bestiality, cum_in_pussy, pokemon_(creature), pokephilia, solo_focus, looking_back, sex_from_behind |
| 7 | 27 |  |  |  |  |  | official_alternate_costume, 1girl, aqua_eyes, aqua_hair, aqua_dress, bare_shoulders, choker, smile, wrist_cuffs, small_breasts, medium_hair, hair_ornament, halter_dress, shorts_under_dress, collarbone, side_slit, pokemon_(creature), looking_at_viewer, sandals, solo, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bike_shorts | cropped_jacket | holding_poke_ball | long_sleeves | poke_ball_(basic) | red_shirt | white_jacket | open_jacket | open_mouth | solo | :d | pokemon_(creature) | simple_background | upper_body | eyelashes | white_background | blush | ^_^ | closed_mouth | tongue | smile | cosplay | hat_ribbon | overalls | red_ribbon | star_earrings | cabbie_hat | thighhighs | one_eye_closed | hetero | penis | completely_nude | vaginal | 1boy | ass | anus | medium_breasts | nipples | testicles | barefoot | bestiality | cum_in_pussy | pokephilia | solo_focus | looking_back | sex_from_behind | official_alternate_costume | aqua_eyes | aqua_hair | aqua_dress | bare_shoulders | choker | wrist_cuffs | small_breasts | medium_hair | hair_ornament | halter_dress | shorts_under_dress | collarbone | side_slit | looking_at_viewer | sandals |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:-----------------|:--------------------|:---------------|:--------------------|:------------|:---------------|:--------------|:-------------|:-------|:-----|:---------------------|:--------------------|:-------------|:------------|:-------------------|:--------|:------|:---------------|:---------|:--------|:----------|:-------------|:-----------|:-------------|:----------------|:-------------|:-------------|:-----------------|:---------|:--------|:------------------|:----------|:-------|:------|:-------|:-----------------|:----------|:------------|:-----------|:-------------|:---------------|:-------------|:-------------|:---------------|:------------------|:-----------------------------|:------------|:------------|:-------------|:-----------------|:---------|:--------------|:----------------|:--------------|:----------------|:---------------|:---------------------|:-------------|:------------|:--------------------|:----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | | X | | X | | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | | X | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | | | | | | | | X | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | | | X | | | | | X | | | | | | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | | | | | | | | | X | | | X | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 7 | 27 |  |  |  |  |  | X | | | | | | | | | | X | | X | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
HuggingFaceM4/textvqa-Sample | Invalid username or password. |
lm468/kanji_meanings_v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 55216884.34
num_examples: 6409
download_size: 65053174
dataset_size: 55216884.34
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sproos/twitter-pairclass-tr | ---
dataset_info:
features:
- name: sent1
sequence: string
- name: sent2
sequence: string
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 11403288
num_examples: 1
download_size: 4721036
dataset_size: 11403288
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter-pairclass-tr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rasu23/iapp_for_orpo | ---
dataset_info:
features:
- name: index_column
dtype: int64
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1218663506
num_examples: 120958
download_size: 116357612
dataset_size: 1218663506
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
aayush9753/InterIIT-Bosch-MidPrep-AgeGenderClassificationInCCTV | ---
license: afl-3.0
---
|
iansousa12/silvervoz2 | ---
license: openrail
---
|
benayas/atis_chatgpt_20pct_v1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 436304
num_examples: 4455
download_size: 151778
dataset_size: 436304
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
saibo/synthie | ---
dataset_info:
features:
- name: id
dtype: int64
- name: entities
list:
- name: surfaceform
dtype: string
- name: uri
dtype: string
- name: relations
list:
- name: surfaceform
dtype: string
- name: uri
dtype: string
- name: triplets
list:
- name: object
struct:
- name: surfaceform
dtype: string
- name: uri
dtype: string
- name: predicate
struct:
- name: surfaceform
dtype: string
- name: uri
dtype: string
- name: subject
struct:
- name: surfaceform
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: linearized_fully_expanded
dtype: string
- name: linearized_subject_collapsed
dtype: string
splits:
- name: test_small_1k
num_bytes: 1085714
num_examples: 1000
- name: test_small
num_bytes: 11156829
num_examples: 10000
- name: val
num_bytes: 11098777
num_examples: 10000
download_size: 11287504
dataset_size: 23341320
configs:
- config_name: default
data_files:
- split: test_small_1k
path: data/test_small_1k-*
- split: test_small
path: data/test_small-*
- split: val
path: data/val-*
---
# SynthIE (subset)
This is a subset of the original SynthIE dataset, available at https://huggingface.co/datasets/martinjosifoski/SynthIE. It specifically includes only the [synthie-text_davinci_003](https://huggingface.co/datasets/martinjosifoski/SynthIE/tree/main/sdg_text_davinci_003) portion.
The `test_small_1k` split represents the initial 1000 records from the `test_small` segment. Since `test_small` was randomly arranged, there was no need for additional shuffling; we simply selected the first 1000 records. |
sethapun/arithmetic_2all_1to750 | ---
dataset_info:
features:
- name: expression
dtype: string
- name: answer
dtype: float64
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 61408
num_examples: 2000
- name: validation
num_bytes: 12266
num_examples: 400
download_size: 33411
dataset_size: 73674
---
# Dataset Card for "arithmetic_2all_1to750"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/honoka_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of honoka/ほのか/穗香 (Azur Lane)
This is the dataset of honoka/ほのか/穗香 (Azur Lane), containing 274 images and their tags.
The core tags of this character are `breasts, one_side_up, pink_hair, large_breasts, hair_ornament, long_hair, red_eyes, skull_hair_ornament, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 274 | 376.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honoka_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 274      | 205.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honoka_azurlane/resolve/main/dataset-800.zip)                | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800  | 681      | 446.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honoka_azurlane/resolve/main/dataset-stage3-p480-800.zip)    | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |
| 1200             | 274      | 327.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honoka_azurlane/resolve/main/dataset-1200.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 681 | 639.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honoka_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/honoka_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
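The non-raw packages above ship as plain IMG+TXT pairs: each image sits next to a same-named `.txt` file holding its comma-separated tags. As a hedged sketch of iterating such a directory without waifuc (the helper name is ours, and a throwaway demo directory stands in for a real extracted archive):

```python
import os
import tempfile

def iter_img_txt_pairs(dataset_dir):
    """Yield (image_path, tag_list) for each IMG+TXT pair in a directory."""
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in ('.png', '.jpg', '.webp'):
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path) as f:
                tags = [t.strip() for t in f.read().split(',')]
            yield os.path.join(dataset_dir, name), tags

# throwaway demo directory standing in for an extracted package
demo = tempfile.mkdtemp()
open(os.path.join(demo, '1.png'), 'wb').close()
with open(os.path.join(demo, '1.txt'), 'w') as f:
    f.write('1girl, solo, smile')

for img, tags in iter_img_txt_pairs(demo):
    print(os.path.basename(img), tags)  # 1.png ['1girl', 'solo', 'smile']
```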
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, bare_shoulders, blush, cleavage, collarbone, looking_at_viewer, pink_eyes, simple_background, solo, white_background, underboob, bare_arms, closed_mouth, navel, upper_body, yellow_bikini, smile, stomach |
| 1 | 14 |  |  |  |  |  | 1girl, bare_shoulders, collarbone, navel, solo, looking_at_viewer, rock, side-tie_bikini_bottom, stomach, thighs, wet, yellow_bikini, pink_eyes, blush, sitting, cleavage, water, bikini_pull, outdoors, underboob, parted_lips, skindentation, bare_arms, strap_pull, string_bikini |
| 2 | 6 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, smile, white_bikini, simple_background, sitting, white_background, arm_support, blush |
| 3 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, solo, bare_shoulders, cleavage, detached_sleeves, white_thighhighs, wide_sleeves, hakama_short_skirt, miko, open_mouth, red_hakama, simple_background, white_background, blush, collarbone, maple_leaf, pink_eyes, smile, cowboy_shot, hip_vent, red_skirt, zettai_ryouiki |
| 4 | 13 |  |  |  |  |  | 1girl, bikini_top_only, looking_at_viewer, solo, jacket, choker, fingerless_gloves, navel, black_bikini, cleavage, blush, open_clothes, short_shorts, belt, torn_thighhighs, black_gloves, collar, black_thighhighs, simple_background, smile, unzipped |
| 5 | 9 |  |  |  |  |  | 1girl, hetero, nipples, open_mouth, penis, solo_focus, 1boy, blush, pussy, sex, sweat, completely_nude, navel, smile, vaginal, cum, mosaic_censoring, spread_legs |
| 6 | 8 |  |  |  |  |  | 1girl, smile, solo, looking_at_viewer, school_uniform, blazer, plaid_skirt, single_glove, black_gloves, pleated_skirt, red_necktie, simple_background, blush, white_background, white_shirt, white_thighhighs, cleavage, huge_breasts, open_mouth, underwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | blush | cleavage | collarbone | looking_at_viewer | pink_eyes | simple_background | solo | white_background | underboob | bare_arms | closed_mouth | navel | upper_body | yellow_bikini | smile | stomach | rock | side-tie_bikini_bottom | thighs | wet | sitting | water | bikini_pull | outdoors | parted_lips | skindentation | strap_pull | string_bikini | white_bikini | arm_support | detached_sleeves | white_thighhighs | wide_sleeves | hakama_short_skirt | miko | open_mouth | red_hakama | maple_leaf | cowboy_shot | hip_vent | red_skirt | zettai_ryouiki | bikini_top_only | jacket | choker | fingerless_gloves | black_bikini | open_clothes | short_shorts | belt | torn_thighhighs | black_gloves | collar | black_thighhighs | unzipped | hetero | nipples | penis | solo_focus | 1boy | pussy | sex | sweat | completely_nude | vaginal | cum | mosaic_censoring | spread_legs | school_uniform | blazer | plaid_skirt | single_glove | pleated_skirt | red_necktie | white_shirt | huge_breasts | underwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:-----------|:-------------|:--------------------|:------------|:--------------------|:-------|:-------------------|:------------|:------------|:---------------|:--------|:-------------|:----------------|:--------|:----------|:-------|:-------------------------|:---------|:------|:----------|:--------|:--------------|:-----------|:--------------|:----------------|:-------------|:----------------|:---------------|:--------------|:-------------------|:-------------------|:---------------|:---------------------|:-------|:-------------|:-------------|:-------------|:--------------|:-----------|:------------|:-----------------|:------------------|:---------|:---------|:--------------------|:---------------|:---------------|:---------------|:-------|:------------------|:---------------|:---------|:-------------------|:-----------|:---------|:----------|:--------|:-------------|:-------|:--------|:------|:--------|:------------------|:----------|:------|:-------------------|:--------------|:-----------------|:---------|:--------------|:---------------|:----------------|:--------------|:--------------|:---------------|:------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | X | X | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | X | X | | X | | X | X | X | | | | | | | X | | | | | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 13 |  |  |  |  |  | X | | X | X | | X | | X | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | X | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | | X | X | | X | | X | X | X | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
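As a toy sketch of mining these clusters, one can filter images whose tag sets contain a cluster's defining tags. The helper and the per-image tag data below are hypothetical stand-ins, not drawn from the actual crawl:

```python
def images_matching(tagged_images, required_tags):
    """Return ids of images whose tag sets contain all of required_tags."""
    required = set(required_tags)
    return [img_id for img_id, tags in tagged_images.items()
            if required <= set(tags)]

# toy stand-in for per-image tag lists
tagged = {
    'a.png': ['1girl', 'solo', 'white_bikini', 'smile'],
    'b.png': ['1girl', 'miko', 'red_hakama'],
    'c.png': ['1girl', 'solo', 'white_bikini', 'sitting'],
}
print(images_matching(tagged, ['solo', 'white_bikini']))  # ['a.png', 'c.png']
```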
|
open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-en-sharded | ---
pretty_name: Evaluation run of pythainlp/wangchanglm-7.5B-sft-en-sharded
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pythainlp/wangchanglm-7.5B-sft-en-sharded](https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-en-sharded)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-en-sharded\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-12T12:19:58.207629](https://huggingface.co/datasets/open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-en-sharded/blob/main/results_2023-10-12T12-19-58.207629.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.13527684563758388,\n\
\ \"em_stderr\": 0.003502595047728489,\n \"f1\": 0.1918613674496648,\n\
\ \"f1_stderr\": 0.003673521698384984,\n \"acc\": 0.29237637276332257,\n\
\ \"acc_stderr\": 0.007586068039653844\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.13527684563758388,\n \"em_stderr\": 0.003502595047728489,\n\
\ \"f1\": 0.1918613674496648,\n \"f1_stderr\": 0.003673521698384984\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148674378\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5824782951854776,\n \"acc_stderr\": 0.013859978264440251\n\
\ }\n}\n```"
repo_url: https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-en-sharded
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_12T12_19_58.207629
path:
- '**/details_harness|drop|3_2023-10-12T12-19-58.207629.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-12T12-19-58.207629.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_12T12_19_58.207629
path:
- '**/details_harness|gsm8k|5_2023-10-12T12-19-58.207629.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-12T12-19-58.207629.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:39:12.796428.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:39:12.796428.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:39:12.796428.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_12T12_19_58.207629
path:
- '**/details_harness|winogrande|5_2023-10-12T12-19-58.207629.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-12T12-19-58.207629.parquet'
- config_name: results
data_files:
- split: 2023_07_19T15_39_12.796428
path:
- results_2023-07-19T15:39:12.796428.parquet
- split: 2023_10_12T12_19_58.207629
path:
- results_2023-10-12T12-19-58.207629.parquet
- split: latest
path:
- results_2023-10-12T12-19-58.207629.parquet
---
# Dataset Card for Evaluation run of pythainlp/wangchanglm-7.5B-sft-en-sharded
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-en-sharded
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [pythainlp/wangchanglm-7.5B-sft-en-sharded](https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-en-sharded) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-en-sharded",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-12T12:19:58.207629](https://huggingface.co/datasets/open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-en-sharded/blob/main/results_2023-10-12T12-19-58.207629.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.13527684563758388,
"em_stderr": 0.003502595047728489,
"f1": 0.1918613674496648,
"f1_stderr": 0.003673521698384984,
"acc": 0.29237637276332257,
"acc_stderr": 0.007586068039653844
},
"harness|drop|3": {
"em": 0.13527684563758388,
"em_stderr": 0.003502595047728489,
"f1": 0.1918613674496648,
"f1_stderr": 0.003673521698384984
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674378
},
"harness|winogrande|5": {
"acc": 0.5824782951854776,
"acc_stderr": 0.013859978264440251
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/kitagawa_mahiro_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kitagawa_mahiro/北川真尋 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kitagawa_mahiro/北川真尋 (THE iDOLM@STER: Cinderella Girls), containing 29 images and their tags.
The core tags of this character are `brown_hair, glasses, short_hair, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 29 | 30.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kitagawa_mahiro_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 29 | 20.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kitagawa_mahiro_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 57 | 34.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kitagawa_mahiro_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 29 | 27.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kitagawa_mahiro_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 57 | 44.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kitagawa_mahiro_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kitagawa_mahiro_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, jewelry, smile, microphone, solo, fingerless_gloves, midriff, confetti, flower |
| 1 | 8 |  |  |  |  |  | 1girl, solo, smile, card_(medium), character_name, midriff, navel, orange_background, sun_symbol, bag, one_eye_closed, open_mouth, plaid, school_uniform, skirt, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | jewelry | smile | microphone | solo | fingerless_gloves | midriff | confetti | flower | card_(medium) | character_name | navel | orange_background | sun_symbol | bag | one_eye_closed | open_mouth | plaid | school_uniform | skirt | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:--------|:-------------|:-------|:--------------------|:----------|:-----------|:---------|:----------------|:-----------------|:--------|:--------------------|:-------------|:------|:-----------------|:-------------|:--------|:-----------------|:--------|:-------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | | X | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
adihaviv/idiomem | ---
license: mit
---
|
Tele-AI/TeleChat-PTD | ---
license: apache-2.0
viewer: false
---
<div align="center">
<h1>
TeleChat Pre-training Dataset (TeleChat-PTD)
</h1>
</div>
<p align="center">
🤗 <a href="https://huggingface.co/Tele-AI" target="_blank">Hugging Face</a> • 🏔 <a href="" target="_blank">MindSpore</a>️ • 🦉 <a href="https://github.com/Tele-AI/Telechat" target="_blank">github</a>️ • 🐾 <a href="https://gitee.com/Tele-AI/tele-chat" target="_blank">gitee</a>️ • 💬 <a href="https://github.com/Tele-AI/Telechat/blob/master/images/wechat.jpg" target="_blank">WeChat</a>
</p>
<p align="center">
<a href="https://arxiv.org/abs/2401.03804" target="_blank"> Tech Report </a>
</p>
# Dataset Introduction
TeleChat-PTD is a comprehensive large-scale Chinese dataset extracted from the pre-training corpus of the Telecom Xingchen large model **TeleChat**. The data mainly comes from web pages, books, official media, and other sources. We filtered it with a combination of rules and models and deduplicated it by similarity, so as to extract data of as high quality as possible.
The TeleChat-PTD dataset publicly releases roughly 270 million records of pure Chinese text, about 1 TB in raw size (480 GB compressed) across 189 files. Other redundant information has already been removed from the dataset.
# Data Download
Hugging Face download: [Data download](https://huggingface.co/datasets/Tele-AI/TeleChat-PTD)
Tianyi Cloud download: [Data download](https://cloud.189.cn/t/ia2QbaVzYf6z) (access code: pkg8)
# Data Format
The data is in JSONL format with a single field, `data`: one processed pre-training record per line.
# Data Cleaning
The data cleaning workflow consists of four steps: rule-based filtering and cleaning, deduplication, high-quality data selection, and data safety processing.
- Rule-based filtering mainly applies general and heuristic rules, such as filtering on text length.
- Deduplication mainly uses similarity-based deduplication to remove overly similar, duplicated data.
- High-quality selection mainly scores the data with models such as BERT and GPT-2 to pick out high-quality data.
- Data safety processing mainly identifies and removes harmful data.
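For illustration only, the similarity-deduplication step described above can be sketched with a character n-gram Jaccard similarity; the actual fingerprinting method and threshold used in the pipeline are not specified, so everything below is an assumption:

```python
def char_ngrams(text, n=3):
    """Set of character n-grams used as a cheap similarity fingerprint."""
    return {text[i:i + n] for i in range(max(len(text) - n + 1, 1))}

def jaccard(a, b):
    """Jaccard similarity between two n-gram sets."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def dedup(texts, threshold=0.8):
    """Keep each text only if it is not too similar to an already-kept one.

    O(n^2) pairwise comparison -- fine for a sketch, not for ~270M records,
    where approximate schemes (MinHash/LSH) would be used instead.
    """
    kept, fingerprints = [], []
    for t in texts:
        g = char_ngrams(t)
        if all(jaccard(g, f) < threshold for f in fingerprints):
            kept.append(t)
            fingerprints.append(g)
    return kept
```

At corpus scale the pairwise loop would be replaced by an approximate nearest-duplicate index, but the keep/drop logic stays the same.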
# Statement, License, and Citation
### Statement
We hereby declare that the TeleChat model and its derivatives must not be used for any activities that endanger national or social security or violate the law. We also ask users not to use the TeleChat model for internet services that have not undergone security review and filing. We hope all users will abide by these principles to ensure that technological development proceeds in a lawful and compliant environment.
We have done our best to ensure the compliance of the data used during model training. However, despite these considerable efforts, unforeseen issues may still arise due to the complexity of the model and the data. Therefore, we assume no liability for any problems caused by the use of the TeleChat open-source model, including but not limited to data security issues, public opinion risks, or any risks and problems arising from the model being misled, misused, disseminated, or improperly exploited.
### License
Community use of the TeleChat model must follow the [TeleChat Model Community License Agreement](./TeleChat模型社区许可协议.pdf). The TeleChat model supports commercial use. If you plan to use the TeleChat model or its derivatives for commercial purposes, you need to submit the application materials required by the TeleChat Model Community License Agreement via the contact email tele_ai@chinatelecom.cn. After approval, you will be granted a non-exclusive, worldwide, non-transferable, non-sublicensable, revocable commercial copyright license.
### Citation
To cite our work, please use the following reference:
```
@misc{wang2024telechat,
title={TeleChat Technical Report},
author={Zihan Wang and Xinzhang Liu and Shixuan Liu and Yitong Yao and Yuyao Huang and Zhongjiang He and Xuelong Li and Yongxiang Li and Zhonghao Che and Zhaoxi Zhang and Yan Wang and Xin Wang and Luwen Pu and Huihan Xu and Ruiyu Fang and Yu Zhao and Jie Zhang and Xiaomeng Huang and Zhilong Lu and Jiaxin Peng and Wenjun Zheng and Shiquan Wang and Bingkai Yang and Xuewei he and Zhuoru Jiang and Qiyi Xie and Yanhan Zhang and Zhongqiu Li and Lingling Shi and Weiwei Fu and Yin Zhang and Zilu Huang and Sishi Xiong and Yuxiang Zhang and Chao Wang and Shuangyong Song},
year={2024},
eprint={2401.03804},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
davidberenstein1957/ray-summit-classy | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': BUSINESS
'1': SCI/TECH
'2': SPORTS
'3': WORLD
splits:
- name: train
num_bytes: 111748.62132352941
num_examples: 435
- name: test
num_bytes: 28001.378676470587
num_examples: 109
download_size: 97950
dataset_size: 139750.0
---
# Dataset Card for "ray-summit-classy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kayile/Jimmy_Valmer_50epoch | ---
language:
- en
pretty_name: Jimmy Valmer (RVC 50 Epoch)
---
<b>Jimmy Valmer (RVC 50 Epoch)</b>
Basically, this was made using all the clean samples of Jimmy Valmer from South Park I could find.
The RVC is pretty much a demo, as I'll make a better one for sure, but so far I'm proud of it!! <3
<b>JUST SOME NOTICES</b>
A. You have to credit me for it
B. It's the old Jimmy voice, from season 5; here's the reference I went with when finding the samples:
<audio controls src="https://s3.amazonaws.com/moonup/production/uploads/6496f43cf0612dfd53b5395e/12IvuvBpX0iA1fnJI7Sx7.wav"></audio>
C. It is slightly deeper than the actual Jimmy, so you might want to lower the index on female voices to 0.9 or 0.8
D. -12 for female voices, 0 for male voices (if the voice is deepened, +12)

<b>example: "Running out of time" by Tyler, the creator</b>
<audio controls src="https://s3.amazonaws.com/moonup/production/uploads/6496f43cf0612dfd53b5395e/tJo65F0ntTRKUSq8VrWve.mpga"></audio>
|
316usman/lse | ---
dataset_info:
features:
- name: text
dtype: string
- name: scope
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 99789946
num_examples: 154814
download_size: 36347876
dataset_size: 99789946
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-launch__gov_report-plain_text-7b7f8a-16126221 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- launch/gov_report
eval_info:
task: summarization
model: google/bigbird-pegasus-large-pubmed
metrics: ['bertscore']
dataset_name: launch/gov_report
dataset_config: plain_text
dataset_split: validation
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/bigbird-pegasus-large-pubmed
* Dataset: launch/gov_report
* Config: plain_text
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nonchalant-nagavalli](https://huggingface.co/nonchalant-nagavalli) for evaluating this model. |
eduagarcia/cc_news_pt | ---
pretty_name: CC-News-PT
annotations_creators:
- no-annotation
language_creators:
- found
language:
- pt
license:
- unknown
size_categories:
- 1B<n<10B
task_categories:
- text-generation
- fill-mask
- text2text-generation
task_ids:
- language-modeling
- masked-language-modeling
---
### Dataset Summary
CC-News-PT is a curation of news articles from CommonCrawl News in the Portuguese language.
CommonCrawl News is a dataset containing news articles from news sites all over the world.
The data is available on AWS S3 in the Common Crawl bucket at /crawl-data/CC-NEWS/.
This version of the dataset is the Portuguese subset of [CloverSearch/cc-news-mutlilingual](https://huggingface.co/datasets/CloverSearch/cc-news-mutlilingual).
### Data Fields
- `title`: a `string` feature.
- `text`: a `string` feature.
- `authors`: a `string` feature.
- `domain`: a `string` feature.
- `date`: a `string` feature.
- `description`: a `string` feature.
- `url`: a `string` feature.
- `image_url`: a `string` feature.
- `date_download`: a `string` feature.
### How to use this dataset
```python
from datasets import load_dataset
dataset = load_dataset("eduagarcia/cc_news_pt", split="train")
```
### Cite
```
@misc{Acerola2023,
author = {Garcia, E.A.S.},
title = {Acerola Corpus: Towards Better Portuguese Language Models},
year = {2023},
doi = {10.57967/hf/0814}
}
``` |
maywell/koVast | ---
license: other
license_name: kovast
license_link: LICENSE
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 1047538413
num_examples: 684579
download_size: 470686367
dataset_size: 1047538413
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# **Massive Korean Multi-Turn Dataset**
Models trained on this data **must** explicitly state that this data was used. (The same applies when serving the model.)
## Thanks to
- [Sionic AI](https://sionic.ai/), for providing the A100 cluster
## Contact
- [Discord Server Link](https://discord.gg/MrBt3PXdXc) |
crewdon/FormulasMax500 | ---
dataset_info:
config_name: crewdon
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 35055132
num_examples: 154634
download_size: 6463417
dataset_size: 35055132
configs:
- config_name: crewdon
data_files:
- split: train
path: crewdon/train-*
---
# Dataset Card for "FormulasMax500"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
datahrvoje/twitter_dataset_1712965164 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 22668
num_examples: 51
download_size: 12861
dataset_size: 22668
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AndrewTsai0406/CRUD_RAG_3QA | ---
dataset_info:
features:
- name: id
dtype: string
- name: event
dtype: string
- name: news1
dtype: string
- name: news2
dtype: string
- name: news3
dtype: string
- name: thoughts
dtype: string
- name: questions
dtype: string
- name: answers
dtype: string
splits:
- name: train
num_bytes: 17142073
num_examples: 3199
download_size: 10160464
dataset_size: 17142073
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mou3az/Question-Answering-Generation-Choices | ---
license: apache-2.0
task_categories:
- question-answering
- text2text-generation
- text-generation
- fill-mask
language:
- en
size_categories:
- 10K<n<100K
---
The dataset is a merged compilation of the QuAIL, RACE, and Cosmos QA datasets, having undergone preprocessing.
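Merging multiple-choice QA datasets requires mapping them onto one schema first. The sketch below is a hedged illustration of that step; the per-source field names are assumptions about the upstream datasets, not a description of the actual preprocessing used here:

```python
def normalize(record, source):
    """Map a raw record into a unified (question, context, choices, answer_idx) schema.

    Field names per source are assumptions for illustration; check the
    upstream dataset cards before relying on them.
    """
    if source == "race":
        # RACE is assumed to store an options list plus a letter answer ("A"-"D").
        return {
            "question": record["question"],
            "context": record["article"],
            "choices": record["options"],
            "answer_idx": "ABCD".index(record["answer"]),
        }
    if source == "cosmos_qa":
        # Cosmos QA is assumed to store four answer columns and an integer label.
        return {
            "question": record["question"],
            "context": record["context"],
            "choices": [record[f"answer{i}"] for i in range(4)],
            "answer_idx": record["label"],
        }
    if source == "quail":
        # QuAIL is assumed to store an answers list and a correct-answer index.
        return {
            "question": record["question"],
            "context": record["context"],
            "choices": record["answers"],
            "answer_idx": record["correct_answer_id"],
        }
    raise ValueError(f"unknown source: {source}")
```

Once normalized this way, records from the three sources share a schema and can be concatenated into a single dataset.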
Gael540/dataSet_ens_sup_fr-v1 | ---
license: apache-2.0
task_categories:
- question-answering
language:
- fr
tags:
- legal
pretty_name: >-
  Dataset on understanding higher education in France, with a particular
  focus on the B.U.T. degree programs
size_categories:
- 100K<n<1M
--- |
marvmk/scalableMLDL2 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 5726523552
num_examples: 5962
- name: test
num_bytes: 2546311152
num_examples: 2651
download_size: 1397392104
dataset_size: 8272834704
---
# Dataset Card for "scalableMLDL2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-us_foreign_policy-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 42692
num_examples: 100
download_size: 28193
dataset_size: 42692
---
# Dataset Card for "mmlu-us_foreign_policy-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/mmarco_zh_train | ---
pretty_name: '`mmarco/zh/train`'
viewer: false
source_datasets: ['irds/mmarco_zh']
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/zh/train`
The `mmarco/zh/train` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/zh/train).
# Data
This dataset provides:
- `queries` (i.e., topics); count=808,731
- `qrels`: (relevance assessments); count=532,761
- `docpairs`; count=39,780,811
- For `docs`, use [`irds/mmarco_zh`](https://huggingface.co/datasets/irds/mmarco_zh)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mmarco_zh_train', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mmarco_zh_train', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
docpairs = load_dataset('irds/mmarco_zh_train', 'docpairs')
for record in docpairs:
record # {'query_id': ..., 'doc_id_a': ..., 'doc_id_b': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
DataProvenanceInitiative/niv2_submix_original | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
splits:
- name: train
num_bytes: 13104211362
num_examples: 10066896
download_size: 7612945130
dataset_size: 13104211362
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "niv2_submix_original"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huggingartists/duran-duran | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/duran-duran"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.414706 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/95697394e4f58c9aa507e408f51008db.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/duran-duran">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Duran Duran</div>
<a href="https://genius.com/artists/duran-duran">
<div style="text-align: center; font-size: 14px;">@duran-duran</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/duran-duran).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/duran-duran")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|360| -| -|
The 'train' split can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/duran-duran")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
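The index arithmetic above (a 90/7/3 split by position) can be sketched in plain Python; `three_way_split` is a hypothetical helper written for illustration, not part of the `datasets` library:

```python
def three_way_split(items, train_p=0.9, validation_p=0.07):
    """Split a list by position into train/validation/test portions."""
    n = len(items)
    cut1 = int(n * train_p)                   # end of the train portion
    cut2 = int(n * (train_p + validation_p))  # end of the validation portion
    return items[:cut1], items[cut1:cut2], items[cut2:]

train, validation, test = three_way_split(list(range(100)))
# len(train), len(validation), len(test) -> 90, 7, 3
```

Note that this (like the `np.split` version above) splits by position rather than randomly; shuffle first if a random split is needed.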
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author={Aleksey Korshuk},
    year={2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
Thefoodprocessor/meal_type | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: int64
- name: recipe
dtype: string
- name: meal_type_title
dtype: string
splits:
- name: train
num_bytes: 107900952
num_examples: 74465
download_size: 54288492
dataset_size: 107900952
---
# Dataset Card for "meal_type"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF | ---
pretty_name: Evaluation run of FPHam/Free_Sydney_13b_HF
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FPHam/Free_Sydney_13b_HF](https://huggingface.co/FPHam/Free_Sydney_13b_HF) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T05:42:30.698824](https://huggingface.co/datasets/open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF/blob/main/results_2023-10-15T05-42-30.698824.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.00041913301788268446,\n \"f1\": 0.06131187080536917,\n\
\ \"f1_stderr\": 0.0013635599924355774,\n \"acc\": 0.4258996525195177,\n\
\ \"acc_stderr\": 0.009976510388912537\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788268446,\n\
\ \"f1\": 0.06131187080536917,\n \"f1_stderr\": 0.0013635599924355774\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09173616376042457,\n \
\ \"acc_stderr\": 0.007950942148339331\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7600631412786109,\n \"acc_stderr\": 0.012002078629485742\n\
\ }\n}\n```"
repo_url: https://huggingface.co/FPHam/Free_Sydney_13b_HF
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|arc:challenge|25_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T05_42_30.698824
path:
- '**/details_harness|drop|3_2023-10-15T05-42-30.698824.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T05-42-30.698824.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T05_42_30.698824
path:
- '**/details_harness|gsm8k|5_2023-10-15T05-42-30.698824.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T05-42-30.698824.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hellaswag|10_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T10:56:58.779734.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-25T10:56:58.779734.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-25T10:56:58.779734.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T05_42_30.698824
path:
- '**/details_harness|winogrande|5_2023-10-15T05-42-30.698824.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T05-42-30.698824.parquet'
- config_name: results
data_files:
- split: 2023_07_25T10_56_58.779734
path:
- results_2023-07-25T10:56:58.779734.parquet
- split: 2023_10_15T05_42_30.698824
path:
- results_2023-10-15T05-42-30.698824.parquet
- split: latest
path:
- results_2023-10-15T05-42-30.698824.parquet
---
# Dataset Card for Evaluation run of FPHam/Free_Sydney_13b_HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/FPHam/Free_Sydney_13b_HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [FPHam/Free_Sydney_13b_HF](https://huggingface.co/FPHam/Free_Sydney_13b_HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-15T05:42:30.698824](https://huggingface.co/datasets/open-llm-leaderboard/details_FPHam__Free_Sydney_13b_HF/blob/main/results_2023-10-15T05-42-30.698824.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268446,
"f1": 0.06131187080536917,
"f1_stderr": 0.0013635599924355774,
"acc": 0.4258996525195177,
"acc_stderr": 0.009976510388912537
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268446,
"f1": 0.06131187080536917,
"f1_stderr": 0.0013635599924355774
},
"harness|gsm8k|5": {
"acc": 0.09173616376042457,
"acc_stderr": 0.007950942148339331
},
"harness|winogrande|5": {
"acc": 0.7600631412786109,
"acc_stderr": 0.012002078629485742
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
acon96/Home-Assistant-Requests | ---
license: mit
task_categories:
- question-answering
- text-generation
tags:
- automation
- home
- assistant
language:
- en
pretty_name: Home Assistant Requests
size_categories:
- 10K<n<100K
---
# Home Assistant Requests Dataset
This dataset contains a list of requests and responses for a user interacting with a personal assistant that controls an instance of [Home Assistant](https://www.home-assistant.io/).
The dataset is generated from the different CSV "piles". The "piles" contain different chunks of requests that are assembled into a final context that is presented to the LLM. For example, `piles/pile_of_device_names.csv` contains only names of various devices to be used as part of context as well as inserted into `piles/pile_of_templated_actions.csv` and `piles/pile_of_status_requests.csv`. The logic for assembling the final dataset from the piles is contained in [generate_home_assistant_data.py](./generate_home_assistant_data.py).
## Generating the dataset from piles
`python3 generate_home_assistant_data.py --train --test --large --sharegpt`
Supported dataset splits are `--test`, `--train`, & `--sample`
Arguments to set the train dataset size are `--small`, `--medium`, `--large`, & `--xl`.
Supported formats are `--raw_corpus` (chatml formatted) & `--sharegpt`
## Merging with other instruct-datasets for training
`python3 generate_home_assistant_data.py --merge <dataset>`
Supported datasets right now are:
- `alpaca`
- `wizardlm70k`
Please note that the supported datasets all have different licenses. Be aware that the license of the resulting data mixture might be different from the license of this dataset alone.
## Adding a new personality
In order to add a new personality, you need to define a new system prompt and a new set of responses for the assistant. The system prompt is the description of the assistant's behavior that occurs at the start of the context. The responses are what is said back to the user when performing a task. The model should still respond with the correct service call no matter what the assistant's response is. The list of system prompts is stored in `pile_of_system_prompts.csv`, and the list of responses is stored in `pile_of_responses.csv`.
There are 2 columns in `pile_of_system_prompts.csv`:
- `persona`: the name of the persona
- `prompt`: the system prompt to use for that persona. Recommended to put this in quotes in case the prompt also has commas in it
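For illustration, hypothetical rows for `pile_of_system_prompts.csv` might look like the following (the persona name and prompt text here are invented examples, not entries from the actual pile):

```csv
persona,prompt
butler,"You are a polite butler who controls a smart home. Respond formally and perform the requested service calls."
```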
The response pile is a CSV with the following headers: `service,response,language,persona,short`
- `service`: the service name that we are responding to. Make sure you cover enough different services so that the model can learn how to respond in all situations.
- `response`: the text of the response. Recommended to put this in quotes in case the response also has commas in it
- `language`: the language code of the response (currently only `en` is supported)
- `persona`: the name of the persona the response belongs to. Use the name of your persona here
- `short`: either 0 or 1. If it is 1 then the response is considered "short" and can be combined with other "short" responses using "and". These are used for examples where there are multiple service calls
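As a sketch, hypothetical rows for `pile_of_responses.csv` could look like this (the service names, response text, and persona below are invented for illustration):

```csv
service,response,language,persona,short
light.turn_on,"Certainly, turning the light on for you.",en,butler,0
light.turn_on,turning on the light,en,butler,1
switch.turn_off,switching off the device,en,butler,1
```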
Generating the full dataset using the Python script will print out a warning for any responses that are missing for a persona.
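The merging of "short" responses described above can be sketched as a small helper. Note this is only an assumption about the behavior; the actual combination logic lives in `generate_home_assistant_data.py` and may differ:

```python
def combine_short_responses(responses):
    """Join several "short" responses into one sentence using "and".

    Sketches how multiple service-call acknowledgements might be merged
    when a single request triggers more than one service call.
    """
    if not responses:
        return ""
    if len(responses) == 1:
        return responses[0]
    # e.g. ["turning on the light", "closing the blinds"]
    # -> "turning on the light and closing the blinds"
    return ", ".join(responses[:-1]) + " and " + responses[-1]
```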
## Adding new Home Assistant functionality
TODO
<!-- In order to add new home assistant device types, you will need to add data to a handful of piles, as well as make small modifications to the `generate_home_assistant_data.py` script.
1. Add 15-30 new device names with the new type to the `pile_of_device_names.csv`. This should be an entity_id and a 'friendly name'
2. Add
-->
|
NaolHF/train | ---
license: apache-2.0
---
|
ruliad/factual-expert-processed-v3-packed | ---
dataset_info:
features:
- name: text
dtype: string
- name: token_count
dtype: int64
splits:
- name: train
num_bytes: 17899758506
num_examples: 514534
download_size: 10472765491
dataset_size: 17899758506
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
KGBrain/visual_files | ---
pretty_name: classification
size_categories:
- n<1K
--- |
Nexdata/57645_Images_Vertical_OCR_Data_in_Text_Scenes | ---
license: cc-by-nc-nd-4.0
---
## Description
57,645 Images - Vertical OCR Data in Text Scenes. The collection scenes of this dataset include street scenes, plaques, billboards, posters, decorations, art lettering, magazine covers, etc. The language distribution includes Chinese and a small amount of English. In this dataset, vertical-level rectangular bounding box (polygonal bounding box, parallelogram bounding box) annotation and transcription are provided for the texts, as well as non-vertical rectangular bounding box (polygonal bounding box, parallelogram bounding box) annotation and transcription. This dataset can be used for tasks such as OCR in vertical text scenes.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1226?source=Huggingface
## Data size
57,645 images, 528,553 bounding boxes
## Collecting environment
including street scenes, plaques, billboards, posters, decorations, art lettering, magazine covers etc.
## Data diversity
multiple scenes, multiple fonts
## Language distribution
Chinese, English (a few)
## Bounding box direction distribution
324,399 vertical bounding boxes, 204,154 non-vertical bounding boxes
## Bounding box shape distribution
34,936 rectangular bounding boxes, 220,716 polygonal bounding boxes, 272,901 parallelogram bounding boxes
## Data format
the image data format is .jpg, the annotation file format is .json
## Annotation content
vertical-level rectangular bounding box (polygonal bounding box, parallelogram bounding box) annotation and transcription for the texts; non-vertical rectangular bounding box (polygonal bounding box, parallelogram bounding box) annotation and transcription for the texts
## Accuracy
The error bound of each vertex of a bounding box is within 3 pixels, which counts as a qualified annotation; the accuracy of bounding boxes is not less than 97%. The text transcription accuracy is not less than 97%.
# Licensing Information
Commercial License
|
NeuML/wikipedia | ---
annotations_creators:
- no-annotation
language_creators:
- crowdsourced
pretty_name: Wikipedia
paperswithcode_id: null
license:
- cc-by-sa-3.0
- gfdl
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
source_datasets:
- original
multilinguality:
- multilingual
size_categories:
- n<1K
- 1K<n<10K
- 10K<n<100K
- 100K<n<1M
- 1M<n<10M
language:
- aa
- ab
- ace
- af
- ak
- als
- am
- an
- ang
- ar
- arc
- arz
- as
- ast
- atj
- av
- ay
- az
- azb
- ba
- bar
- bcl
- be
- bg
- bh
- bi
- bjn
- bm
- bn
- bo
- bpy
- br
- bs
- bug
- bxr
- ca
- cbk
- cdo
- ce
- ceb
- ch
- cho
- chr
- chy
- ckb
- co
- cr
- crh
- cs
- csb
- cu
- cv
- cy
- da
- de
- din
- diq
- dsb
- dty
- dv
- dz
- ee
- el
- eml
- en
- eo
- es
- et
- eu
- ext
- fa
- ff
- fi
- fj
- fo
- fr
- frp
- frr
- fur
- fy
- ga
- gag
- gan
- gd
- gl
- glk
- gn
- gom
- gor
- got
- gu
- gv
- ha
- hak
- haw
- he
- hi
- hif
- ho
- hr
- hsb
- ht
- hu
- hy
- ia
- id
- ie
- ig
- ii
- ik
- ilo
- inh
- io
- is
- it
- iu
- ja
- jam
- jbo
- jv
- ka
- kaa
- kab
- kbd
- kbp
- kg
- ki
- kj
- kk
- kl
- km
- kn
- ko
- koi
- krc
- ks
- ksh
- ku
- kv
- kw
- ky
- la
- lad
- lb
- lbe
- lez
- lfn
- lg
- li
- lij
- lmo
- ln
- lo
- lrc
- lt
- ltg
- lv
- lzh
- mai
- mdf
- mg
- mh
- mhr
- mi
- min
- mk
- ml
- mn
- mr
- mrj
- ms
- mt
- mus
- mwl
- my
- myv
- mzn
- na
- nah
- nan
- nap
- nds
- ne
- new
- ng
- nl
- nn
- 'no'
- nov
- nrf
- nso
- nv
- ny
- oc
- olo
- om
- or
- os
- pa
- pag
- pam
- pap
- pcd
- pdc
- pfl
- pi
- pih
- pl
- pms
- pnb
- pnt
- ps
- pt
- qu
- rm
- rmy
- rn
- ro
- ru
- rue
- rup
- rw
- sa
- sah
- sat
- sc
- scn
- sco
- sd
- se
- sg
- sgs
- sh
- si
- sk
- sl
- sm
- sn
- so
- sq
- sr
- srn
- ss
- st
- stq
- su
- sv
- sw
- szl
- ta
- tcy
- tdt
- te
- tg
- th
- ti
- tk
- tl
- tn
- to
- tpi
- tr
- ts
- tt
- tum
- tw
- ty
- tyv
- udm
- ug
- uk
- ur
- uz
- ve
- vec
- vep
- vi
- vls
- vo
- vro
- wa
- war
- wo
- wuu
- xal
- xh
- xmf
- yi
- yo
- yue
- za
- zea
- zh
- zu
language_bcp47:
- nds-nl
config_names:
- 20240101.aa
- 20220101.ab
- 20240101.ace
- 20240101.ady
- 20240101.af
- 20240101.ak
- 20240101.als
- 20240101.am
- 20240101.an
- 20240101.ang
- 20240101.ar
- 20240101.arc
- 20240101.arz
- 20240101.as
- 20240101.ast
- 20240101.atj
- 20240101.av
- 20240101.ay
- 20240101.az
- 20240101.azb
- 20240101.ba
- 20240101.bar
- 20240101.bat-smg
- 20240101.bcl
- 20240101.be
- 20240101.be-x-old
- 20240101.bg
- 20240101.bh
- 20240101.bi
- 20240101.bjn
- 20240101.bm
- 20240101.bn
- 20240101.bo
- 20240101.bpy
- 20240101.br
- 20240101.bs
- 20240101.bug
- 20240101.bxr
- 20240101.ca
- 20240101.cbk-zam
- 20240101.cdo
- 20240101.ce
- 20240101.ceb
- 20240101.ch
- 20240101.cho
- 20240101.chr
- 20240101.chy
- 20240101.ckb
- 20240101.co
- 20240101.cr
- 20240101.crh
- 20240101.cs
- 20240101.csb
- 20240101.cu
- 20240101.cv
- 20240101.cy
- 20240101.da
- 20240101.de
- 20240101.din
- 20240101.diq
- 20240101.dsb
- 20240101.dty
- 20240101.dv
- 20240101.dz
- 20240101.ee
- 20240101.el
- 20240101.eml
- 20240101.en
- 20240101.eo
- 20240101.es
- 20240101.et
- 20240101.eu
- 20240101.ext
- 20240101.fa
- 20240101.ff
- 20240101.fi
- 20240101.fiu-vro
- 20240101.fj
- 20240101.fo
- 20240101.fr
- 20240101.frp
- 20240101.frr
- 20240101.fur
- 20240101.fy
- 20240101.ga
- 20240101.gag
- 20240101.gan
- 20240101.gd
- 20240101.gl
- 20240101.glk
- 20240101.gn
- 20240101.gom
- 20240101.gor
- 20240101.got
- 20240101.gu
- 20240101.gv
- 20240101.ha
- 20240101.hak
- 20240101.haw
- 20240101.he
- 20240101.hi
- 20240101.hif
- 20240101.ho
- 20240101.hr
- 20240101.hsb
- 20240101.ht
- 20240101.hu
- 20240101.hy
- 20240101.ia
- 20240101.id
- 20240101.ie
- 20240101.ig
- 20240101.ii
- 20240101.ik
- 20240101.ilo
- 20240101.inh
- 20240101.io
- 20240101.is
- 20240101.it
- 20240101.iu
- 20240101.ja
- 20240101.jam
- 20240101.jbo
- 20240101.jv
- 20240101.ka
- 20240101.kaa
- 20240101.kab
- 20240101.kbd
- 20240101.kbp
- 20240101.kg
- 20240101.ki
- 20240101.kj
- 20240101.kk
- 20240101.kl
- 20240101.km
- 20240101.kn
- 20240101.ko
- 20240101.koi
- 20240101.krc
- 20240101.ks
- 20240101.ksh
- 20240101.ku
- 20240101.kv
- 20240101.kw
- 20240101.ky
- 20240101.la
- 20240101.lad
- 20240101.lb
- 20240101.lbe
- 20240101.lez
- 20240101.lfn
- 20240101.lg
- 20240101.li
- 20240101.lij
- 20240101.lmo
- 20240101.ln
- 20240101.lo
- 20240101.lrc
- 20240101.lt
- 20240101.ltg
- 20240101.lv
- 20240101.mai
- 20240101.map-bms
- 20240101.mdf
- 20240101.mg
- 20240101.mh
- 20240101.mhr
- 20240101.mi
- 20240101.min
- 20240101.mk
- 20240101.ml
- 20240101.mn
- 20240101.mr
- 20240101.mrj
- 20240101.ms
- 20240101.mt
- 20240101.mus
- 20240101.mwl
- 20240101.my
- 20240101.myv
- 20240101.mzn
- 20240101.na
- 20240101.nah
- 20240101.nap
- 20240101.nds
- 20240101.nds-nl
- 20240101.ne
- 20240101.new
- 20240101.ng
- 20240101.nl
- 20240101.nn
- 20240101.no
- 20240101.nov
- 20240101.nrm
- 20240101.nso
- 20240101.nv
- 20240101.ny
- 20240101.oc
- 20240101.olo
- 20240101.om
- 20240101.or
- 20240101.os
- 20240101.pa
- 20240101.pag
- 20240101.pam
- 20240101.pap
- 20240101.pcd
- 20240101.pdc
- 20240101.pfl
- 20240101.pi
- 20240101.pih
- 20240101.pl
- 20240101.pms
- 20240101.pnb
- 20240101.pnt
- 20240101.ps
- 20240101.pt
- 20240101.qu
- 20240101.rm
- 20240101.rmy
- 20240101.rn
- 20240101.ro
- 20240101.roa-rup
- 20240101.roa-tara
- 20240101.ru
- 20240101.rue
- 20240101.rw
- 20240101.sa
- 20240101.sah
- 20240101.sat
- 20240101.sc
- 20240101.scn
- 20240101.sco
- 20240101.sd
- 20240101.se
- 20240101.sg
- 20240101.sh
- 20240101.si
- 20240101.simple
- 20240101.sk
- 20240101.sl
- 20240101.sm
- 20240101.sn
- 20240101.so
- 20240101.sq
- 20240101.sr
- 20240101.srn
- 20240101.ss
- 20240101.st
- 20240101.stq
- 20240101.su
- 20240101.sv
- 20240101.sw
- 20240101.szl
- 20240101.ta
- 20240101.tcy
- 20240101.te
- 20240101.tet
- 20240101.tg
- 20240101.th
- 20240101.ti
- 20240101.tk
- 20240101.tl
- 20240101.tn
- 20240101.to
- 20240101.tpi
- 20240101.tr
- 20240101.ts
- 20240101.tt
- 20240101.tum
- 20240101.tw
- 20240101.ty
- 20240101.tyv
- 20240101.udm
- 20240101.ug
- 20240101.uk
- 20240101.ur
- 20240101.uz
- 20240101.ve
- 20240101.vec
- 20240101.vep
- 20240101.vi
- 20240101.vls
- 20240101.vo
- 20240101.wa
- 20240101.war
- 20240101.wo
- 20240101.wuu
- 20240101.xal
- 20240101.xh
- 20240101.xmf
- 20240101.yi
- 20240101.yo
- 20240101.za
- 20240101.zea
- 20240101.zh
- 20240101.zh-classical
- 20240101.zh-min-nan
- 20240101.zh-yue
- 20240101.zu
---
# Dataset Card for Wikipedia
This repo is a fork of the [olm/wikipedia](https://huggingface.co/datasets/olm/wikipedia) repo which itself is a fork of the original Hugging Face Wikipedia repo [here](https://huggingface.co/datasets/wikipedia).
This fork modifies `olm/wikipedia` to enable running on lower resourced machines. These changes have been proposed as a [PR with the olm/wikipedia project](https://huggingface.co/datasets/olm/wikipedia/discussions/6).
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://dumps.wikimedia.org](https://dumps.wikimedia.org)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
Wikipedia dataset containing cleaned articles of all languages. The datasets are built from the Wikipedia dump (https://dumps.wikimedia.org/) with one split per language. Each example contains the content of one full Wikipedia article with cleaning to strip markdown and unwanted sections (references, etc.).
The articles are parsed using the ``mwparserfromhell`` tool.
To load this dataset you need to install the following dependencies:
```
pip install mwparserfromhell datasets
```
Then, you can load any subset of Wikipedia per language and per date this way:
```python
from datasets import load_dataset
load_dataset("neuml/wikipedia", language="en", date="20240101")
```
You can find the full list of languages and dates [here](https://dumps.wikimedia.org/backup-index.html).
### Supported Tasks and Leaderboards
The dataset is generally used for Language Modeling.
### Languages
You can find the list of languages [here](https://meta.wikimedia.org/wiki/List_of_Wikipedias).
## Dataset Structure
### Data Instances
An example looks as follows:
```
{'id': '1',
'url': 'https://simple.wikipedia.org/wiki/April',
'title': 'April',
'text': 'April is the fourth month...'
}
```
### Data Fields
The data fields are the same among all configurations:
- `id` (`str`): ID of the article.
- `url` (`str`): URL of the article.
- `title` (`str`): Title of the article.
- `text` (`str`): Text content of the article.
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
Most of Wikipedia's text and many of its images are co-licensed under the [Creative Commons Attribution-ShareAlike 3.0 Unported License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_Creative_Commons_Attribution-ShareAlike_3.0_Unported_License) (CC BY-SA) and the [GNU Free Documentation License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_the_GNU_Free_Documentation_License) (GFDL) (unversioned, with no invariant sections, front-cover texts, or back-cover texts).
Some text has been imported only under CC BY-SA and CC BY-SA-compatible license and cannot be reused under GFDL; such text will be identified on the page footer, in the page history, or on the discussion page of the article that utilizes the text.
### Citation Information
```
@ONLINE{wikidump,
author = "Wikimedia Foundation",
title = "Wikimedia Downloads",
url = "https://dumps.wikimedia.org"
}
```
|
aneeshas/imsdb-500tokensci-fi-movie-scripts | ---
dataset_info:
features:
- name: Sci-Fi
dtype: string
splits:
- name: train
num_bytes: 82670
num_examples: 180
download_size: 53226
dataset_size: 82670
---
# Dataset Card for "imsdb-500tokensci-fi-movie-scripts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sasha/prof_images_blip__Lykon-DreamShaper | ---
dataset_info:
features:
- name: images
dtype: image
- name: embeddings
sequence: float32
splits:
- name: courier
num_bytes: 3220269.0
num_examples: 100
- name: aide
num_bytes: 3472385.0
num_examples: 100
- name: police_officer
num_bytes: 2971579.0
num_examples: 100
- name: purchasing_agent
num_bytes: 3706168.0
num_examples: 100
- name: metal_worker
num_bytes: 4300217.0
num_examples: 100
- name: financial_analyst
num_bytes: 3730273.0
num_examples: 100
- name: stocker
num_bytes: 3002092.0
num_examples: 100
- name: it_specialist
num_bytes: 3849162.0
num_examples: 100
- name: writer
num_bytes: 3815757.0
num_examples: 100
- name: accountant
num_bytes: 3301253.0
num_examples: 100
- name: coach
num_bytes: 3364291.0
num_examples: 100
- name: painter
num_bytes: 3587432.0
num_examples: 100
- name: real_estate_broker
num_bytes: 3143465.0
num_examples: 100
- name: truck_driver
num_bytes: 4168681.0
num_examples: 100
- name: data_entry_keyer
num_bytes: 3810901.0
num_examples: 100
- name: computer_support_specialist
num_bytes: 3768802.0
num_examples: 100
- name: cook
num_bytes: 3783118.0
num_examples: 100
- name: interior_designer
num_bytes: 3929319.0
num_examples: 100
- name: nutritionist
num_bytes: 3866238.0
num_examples: 100
- name: designer
num_bytes: 3360493.0
num_examples: 100
- name: maid
num_bytes: 3269062.0
num_examples: 100
- name: producer
num_bytes: 4011654.0
num_examples: 100
- name: executive_assistant
num_bytes: 3109178.0
num_examples: 100
- name: logistician
num_bytes: 3905564.0
num_examples: 100
- name: tractor_operator
num_bytes: 5188801.0
num_examples: 100
- name: doctor
num_bytes: 3038762.0
num_examples: 100
- name: inventory_clerk
num_bytes: 3902424.0
num_examples: 100
- name: sheet_metal_worker
num_bytes: 4046848.0
num_examples: 100
- name: groundskeeper
num_bytes: 3526805.0
num_examples: 100
- name: electrical_engineer
num_bytes: 5068341.0
num_examples: 100
- name: physical_therapist
num_bytes: 2872364.0
num_examples: 100
- name: insurance_agent
num_bytes: 2964103.0
num_examples: 100
- name: aerospace_engineer
num_bytes: 4889373.0
num_examples: 100
- name: psychologist
num_bytes: 2930630.0
num_examples: 100
- name: financial_advisor
num_bytes: 3101141.0
num_examples: 100
- name: printing_press_operator
num_bytes: 4325576.0
num_examples: 100
- name: architect
num_bytes: 3334524.0
num_examples: 100
- name: dental_hygienist
num_bytes: 3116590.0
num_examples: 100
- name: artist
num_bytes: 3321552.0
num_examples: 100
- name: office_worker
num_bytes: 3392256.0
num_examples: 100
- name: ceo
num_bytes: 2725789.0
num_examples: 100
- name: taxi_driver
num_bytes: 4421050.0
num_examples: 100
- name: librarian
num_bytes: 3760714.0
num_examples: 100
- name: author
num_bytes: 3841657.0
num_examples: 100
- name: plumber
num_bytes: 3721155.0
num_examples: 100
- name: construction_worker
num_bytes: 3595787.0
num_examples: 100
- name: clergy
num_bytes: 3326689.0
num_examples: 100
- name: electrician
num_bytes: 4444433.0
num_examples: 100
- name: jailer
num_bytes: 4249238.0
num_examples: 100
- name: credit_counselor
num_bytes: 3340328.0
num_examples: 100
- name: scientist
num_bytes: 3763435.0
num_examples: 100
- name: drywall_installer
num_bytes: 3186332.0
num_examples: 100
- name: school_bus_driver
num_bytes: 4588003.0
num_examples: 100
- name: dental_assistant
num_bytes: 3135047.0
num_examples: 100
- name: fitness_instructor
num_bytes: 3356902.0
num_examples: 100
- name: detective
num_bytes: 2545399.0
num_examples: 100
- name: hairdresser
num_bytes: 3197788.0
num_examples: 100
- name: welder
num_bytes: 4549984.0
num_examples: 100
- name: pharmacy_technician
num_bytes: 4237065.0
num_examples: 100
- name: compliance_officer
num_bytes: 3241075.0
num_examples: 100
- name: singer
num_bytes: 3198810.0
num_examples: 100
- name: tutor
num_bytes: 3442962.0
num_examples: 100
- name: language_pathologist
num_bytes: 3238081.0
num_examples: 100
- name: medical_records_specialist
num_bytes: 3478698.0
num_examples: 100
- name: sales_manager
num_bytes: 2889842.0
num_examples: 100
- name: industrial_engineer
num_bytes: 4524725.0
num_examples: 100
- name: manager
num_bytes: 2976237.0
num_examples: 100
- name: mechanic
num_bytes: 3973394.0
num_examples: 100
- name: postal_worker
num_bytes: 3518223.0
num_examples: 100
- name: computer_systems_analyst
num_bytes: 4211576.0
num_examples: 100
- name: salesperson
num_bytes: 2955675.0
num_examples: 100
- name: office_clerk
num_bytes: 3633420.0
num_examples: 100
- name: claims_appraiser
num_bytes: 3668012.0
num_examples: 100
- name: security_guard
num_bytes: 2878171.0
num_examples: 100
- name: interviewer
num_bytes: 2842270.0
num_examples: 100
- name: dispatcher
num_bytes: 4311103.0
num_examples: 100
- name: lawyer
num_bytes: 2978106.0
num_examples: 100
- name: marketing_manager
num_bytes: 2898102.0
num_examples: 100
- name: customer_service_representative
num_bytes: 3353667.0
num_examples: 100
- name: software_developer
num_bytes: 3080372.0
num_examples: 100
- name: mover
num_bytes: 3406522.0
num_examples: 100
- name: supervisor
num_bytes: 3256695.0
num_examples: 100
- name: paralegal
num_bytes: 3144149.0
num_examples: 100
- name: graphic_designer
num_bytes: 3779936.0
num_examples: 100
- name: dentist
num_bytes: 3051311.0
num_examples: 100
- name: roofer
num_bytes: 4510641.0
num_examples: 100
- name: public_relations_specialist
num_bytes: 3018253.0
num_examples: 100
- name: engineer
num_bytes: 4143278.0
num_examples: 100
- name: occupational_therapist
num_bytes: 3172574.0
num_examples: 100
- name: manicurist
num_bytes: 3014804.0
num_examples: 100
- name: cleaner
num_bytes: 2822728.0
num_examples: 100
- name: facilities_manager
num_bytes: 3233702.0
num_examples: 100
- name: repair_worker
num_bytes: 3945550.0
num_examples: 100
- name: cashier
num_bytes: 4015653.0
num_examples: 100
- name: baker
num_bytes: 3760855.0
num_examples: 100
- name: market_research_analyst
num_bytes: 3801266.0
num_examples: 100
- name: health_technician
num_bytes: 3208097.0
num_examples: 100
- name: veterinarian
num_bytes: 3218038.0
num_examples: 100
- name: underwriter
num_bytes: 2965985.0
num_examples: 100
- name: mechanical_engineer
num_bytes: 4864008.0
num_examples: 100
- name: janitor
num_bytes: 3256354.0
num_examples: 100
- name: pilot
num_bytes: 3849806.0
num_examples: 100
- name: therapist
num_bytes: 2913566.0
num_examples: 100
- name: director
num_bytes: 3015590.0
num_examples: 100
- name: wholesale_buyer
num_bytes: 4007741.0
num_examples: 100
- name: air_conditioning_installer
num_bytes: 4078377.0
num_examples: 100
- name: butcher
num_bytes: 4473092.0
num_examples: 100
- name: machinery_mechanic
num_bytes: 4410538.0
num_examples: 100
- name: event_planner
num_bytes: 3416510.0
num_examples: 100
- name: carpet_installer
num_bytes: 4231786.0
num_examples: 100
- name: musician
num_bytes: 3496741.0
num_examples: 100
- name: civil_engineer
num_bytes: 3887933.0
num_examples: 100
- name: farmer
num_bytes: 4224326.0
num_examples: 100
- name: financial_manager
num_bytes: 3032824.0
num_examples: 100
- name: childcare_worker
num_bytes: 3723729.0
num_examples: 100
- name: clerk
num_bytes: 3603897.0
num_examples: 100
- name: machinist
num_bytes: 3776999.0
num_examples: 100
- name: firefighter
num_bytes: 4226861.0
num_examples: 100
- name: photographer
num_bytes: 3227910.0
num_examples: 100
- name: file_clerk
num_bytes: 4124578.0
num_examples: 100
- name: bus_driver
num_bytes: 4379280.0
num_examples: 100
- name: fast_food_worker
num_bytes: 3902204.0
num_examples: 100
- name: bartender
num_bytes: 4232353.0
num_examples: 100
- name: computer_programmer
num_bytes: 4013303.0
num_examples: 100
- name: pharmacist
num_bytes: 4163465.0
num_examples: 100
- name: nursing_assistant
num_bytes: 3232853.0
num_examples: 100
- name: career_counselor
num_bytes: 3402257.0
num_examples: 100
- name: mental_health_counselor
num_bytes: 2864853.0
num_examples: 100
- name: network_administrator
num_bytes: 4548591.0
num_examples: 100
- name: teacher
num_bytes: 3003287.0
num_examples: 100
- name: dishwasher
num_bytes: 4891231.0
num_examples: 100
- name: teller
num_bytes: 3044401.0
num_examples: 100
- name: teaching_assistant
num_bytes: 2980715.0
num_examples: 100
- name: payroll_clerk
num_bytes: 3659293.0
num_examples: 100
- name: laboratory_technician
num_bytes: 3821994.0
num_examples: 100
- name: social_assistant
num_bytes: 1642549.0
num_examples: 100
- name: radiologic_technician
num_bytes: 3606317.0
num_examples: 100
- name: social_worker
num_bytes: 3202655.0
num_examples: 100
- name: nurse
num_bytes: 3163177.0
num_examples: 100
- name: receptionist
num_bytes: 3232646.0
num_examples: 100
- name: carpenter
num_bytes: 4186317.0
num_examples: 100
- name: correctional_officer
num_bytes: 3250295.0
num_examples: 100
- name: community_manager
num_bytes: 2923881.0
num_examples: 100
- name: massage_therapist
num_bytes: 2775268.0
num_examples: 100
- name: head_cook
num_bytes: 3711054.0
num_examples: 100
- name: plane_mechanic
num_bytes: 4178003.0
num_examples: 100
download_size: 547079713
dataset_size: 524072204.0
---
# Dataset Card for "prof_images_blip__Lykon-DreamShaper"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RayRuiboChen/Self-Filter-LLaVA-25K | ---
license: cc-by-4.0
---
## Dataset details
Instructions selected from [LLaVA-Instruct-150K](https://huggingface.co/datasets/liuhaotian/LLaVA-Instruct-150K).

- `self_filter_25k_clip.json`: filtered annotations under the CLIP setting.
- `self_filter_25k_scores.json`: filtered annotations under the Scores setting.
- `difficulty_clip.json`: difficulty score for each instruction under the CLIP setting.
- `difficulty_scores.json`: difficulty score for each instruction under the Scores setting.
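The card does not spell out the JSON record layout. As a minimal sketch, assuming each annotation carries an `id` field that keys into the corresponding difficulty file, one might combine the two files like this (`select_hard_instructions` and the toy records are hypothetical illustrations, not part of the dataset):

```python
# Hypothetical sketch: pair Self-Filter annotations with their difficulty
# scores and keep only instructions above a chosen threshold.
# In real use you would load the files from the repo, e.g.:
#   annotations = json.load(open("self_filter_25k_clip.json"))
#   difficulty  = json.load(open("difficulty_clip.json"))

def select_hard_instructions(annotations, difficulty, threshold):
    """Keep annotations whose difficulty score exceeds `threshold`."""
    return [ann for ann in annotations if difficulty.get(ann["id"], 0.0) > threshold]

# Toy records standing in for the real JSON contents (layout assumed):
annotations = [{"id": "a"}, {"id": "b"}, {"id": "c"}]
difficulty = {"a": 0.9, "b": 0.2, "c": 0.7}

hard = select_hard_instructions(annotations, difficulty, threshold=0.5)
print([ann["id"] for ann in hard])  # ["a", "c"]
```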
**Paper or resources for more information:** [https://github.com/RayRuiboChen/Self-Filter](https://github.com/RayRuiboChen/Self-Filter) |
open-llm-leaderboard/details_eren23__dpo-binarized-NeuralTrix-7B | ---
pretty_name: Evaluation run of eren23/dpo-binarized-NeuralTrix-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [eren23/dpo-binarized-NeuralTrix-7B](https://huggingface.co/eren23/dpo-binarized-NeuralTrix-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eren23__dpo-binarized-NeuralTrix-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-11T20:45:49.015685](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__dpo-binarized-NeuralTrix-7B/blob/main/results_2024-02-11T20-45-49.015685.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6469364528234108,\n\
\ \"acc_stderr\": 0.032183894515300515,\n \"acc_norm\": 0.6464632195656521,\n\
\ \"acc_norm_stderr\": 0.03285550090176264,\n \"mc1\": 0.6364749082007344,\n\
\ \"mc1_stderr\": 0.016838862883965834,\n \"mc2\": 0.7906684401427805,\n\
\ \"mc2_stderr\": 0.013527182281452275\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6962457337883959,\n \"acc_stderr\": 0.01343890918477876,\n\
\ \"acc_norm\": 0.7235494880546075,\n \"acc_norm_stderr\": 0.013069662474252425\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7118103963353913,\n\
\ \"acc_stderr\": 0.004519941716508364,\n \"acc_norm\": 0.8888667596096396,\n\
\ \"acc_norm_stderr\": 0.0031365472766898884\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404907,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404907\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941197,\n\
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n\
\ \"acc_stderr\": 0.016607021781050873,\n \"acc_norm\": 0.441340782122905,\n\
\ \"acc_norm_stderr\": 0.016607021781050873\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02609016250427905,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02609016250427905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n\
\ \"acc_stderr\": 0.012747248967079067,\n \"acc_norm\": 0.470013037809648,\n\
\ \"acc_norm_stderr\": 0.012747248967079067\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6364749082007344,\n\
\ \"mc1_stderr\": 0.016838862883965834,\n \"mc2\": 0.7906684401427805,\n\
\ \"mc2_stderr\": 0.013527182281452275\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.846093133385951,\n \"acc_stderr\": 0.01014194452375004\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6800606520090978,\n \
\ \"acc_stderr\": 0.012848426555240761\n }\n}\n```"
repo_url: https://huggingface.co/eren23/dpo-binarized-NeuralTrix-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|arc:challenge|25_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|gsm8k|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hellaswag|10_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T20-45-49.015685.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T20-45-49.015685.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- '**/details_harness|winogrande|5_2024-02-11T20-45-49.015685.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-11T20-45-49.015685.parquet'
- config_name: results
data_files:
- split: 2024_02_11T20_45_49.015685
path:
- results_2024-02-11T20-45-49.015685.parquet
- split: latest
path:
- results_2024-02-11T20-45-49.015685.parquet
---
# Dataset Card for Evaluation run of eren23/dpo-binarized-NeuralTrix-7B
Dataset automatically created during the evaluation run of model [eren23/dpo-binarized-NeuralTrix-7B](https://huggingface.co/eren23/dpo-binarized-NeuralTrix-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eren23__dpo-binarized-NeuralTrix-7B",
"harness_winogrande_5",
split="train")
```
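Each per-task entry in the results (shown under "Latest results" below) maps a task name to its metrics. A minimal sketch of flattening such a dict into per-task accuracies, using illustrative values excerpted from this card (not the full results):

```python
# Illustrative subset of a results dict, in the same shape as the
# "Latest results" JSON below; values are truncated for readability.
results = {
    "all": {"acc": 0.6469, "acc_norm": 0.6465},
    "harness|arc:challenge|25": {"acc": 0.6962, "acc_norm": 0.7235},
    "harness|hellaswag|10": {"acc": 0.7118, "acc_norm": 0.8889},
}

# Keep only per-task entries, skipping the aggregated "all" bucket.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all"
}

for task, acc in sorted(per_task_acc.items()):
    print(f"{task}: {acc:.4f}")
```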
## Latest results
These are the [latest results from run 2024-02-11T20:45:49.015685](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__dpo-binarized-NeuralTrix-7B/blob/main/results_2024-02-11T20-45-49.015685.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6469364528234108,
"acc_stderr": 0.032183894515300515,
"acc_norm": 0.6464632195656521,
"acc_norm_stderr": 0.03285550090176264,
"mc1": 0.6364749082007344,
"mc1_stderr": 0.016838862883965834,
"mc2": 0.7906684401427805,
"mc2_stderr": 0.013527182281452275
},
"harness|arc:challenge|25": {
"acc": 0.6962457337883959,
"acc_stderr": 0.01343890918477876,
"acc_norm": 0.7235494880546075,
"acc_norm_stderr": 0.013069662474252425
},
"harness|hellaswag|10": {
"acc": 0.7118103963353913,
"acc_stderr": 0.004519941716508364,
"acc_norm": 0.8888667596096396,
"acc_norm_stderr": 0.0031365472766898884
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404907,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066485,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050873,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050873
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079067,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079067
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6364749082007344,
"mc1_stderr": 0.016838862883965834,
"mc2": 0.7906684401427805,
"mc2_stderr": 0.013527182281452275
},
"harness|winogrande|5": {
"acc": 0.846093133385951,
"acc_stderr": 0.01014194452375004
},
"harness|gsm8k|5": {
"acc": 0.6800606520090978,
"acc_stderr": 0.012848426555240761
}
}
```
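The per-task accuracies above can be aggregated programmatically. A minimal sketch (using a small inline sample in the same shape as the JSON above, since the full results file is not named here):

```python
# Inline sample in the same shape as the results JSON above; in practice
# you would load the full results file with json.load().
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.8803418803418803},
    "harness|hendrycksTest-virology|5": {"acc": 0.5602409638554217},
    "harness|gsm8k|5": {"acc": 0.6800606520090978},
}

# Average accuracy over the MMLU (hendrycksTest) subtasks only.
mmlu = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu) / len(mmlu)
print(round(mmlu_avg, 4))  # → 0.7203 for this two-task sample
```

The same filter-and-average pattern extends to any task family in the harness output by changing the substring match.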
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
sezosan/arc_tr_s3 | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: validation
num_bytes: 86423.0
num_examples: 250
download_size: 50891
dataset_size: 86423.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "arc_tr_s3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
icon-it-tdtu/mtet | ---
dataset_info:
features:
- name: vi
dtype: string
- name: en
dtype: string
- name: loss
dtype: float64
splits:
- name: train
num_bytes: 188286058.1887368
num_examples: 1000000
download_size: 199988484
dataset_size: 188286058.1887368
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mboth/medienVersorgen-200-undersampled | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': Bereitstellen
'1': Entsorgen
'2': Speichern
'3': Verteilen
splits:
- name: train
num_bytes: 79475.56393442623
num_examples: 403
- name: test
num_bytes: 14725
num_examples: 77
- name: valid
num_bytes: 14725
num_examples: 77
download_size: 47115
dataset_size: 108925.56393442623
---
# Dataset Card for "medienVersorgen-200-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_239 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1500335068
num_examples: 292349
download_size: 1532898350
dataset_size: 1500335068
---
# Dataset Card for "chunk_239"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
merve/pokemon-ds-embeddings | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 121979613.0
num_examples: 833
download_size: 103390020
dataset_size: 121979613.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ptaszynski/PolishCyberbullyingDataset | ---
license: cc-by-4.0
language:
- pl
tags:
- cyberbullying
- hate-speech
pretty_name: PolishCyberbullyingDataset
---
# Expert-annotated dataset to study cyberbullying in Polish language
This is the first publicly available expert-annotated dataset containing annotations of cyberbullying and hate speech in the Polish language.
Please read [the paper](https://www.mdpi.com/2306-5729/9/1/1) about the dataset for all necessary details.
## Model
The classification model which achieved the highest classification results for the dataset is also released under the following URL.
[Polbert-CB - Polish BERT trained for Automatic Cyberbullying Detection](https://huggingface.co/ptaszynski/bert-base-polish-cyberbullying)
## Citations
Whenever you use the dataset, please cite it using the following reference to [the paper](https://www.mdpi.com/2306-5729/9/1/1).
```
@article{ptaszynski2023expert,
title={Expert-Annotated Dataset to Study Cyberbullying in Polish Language},
author={Ptaszynski, Michal and Pieciukiewicz, Agata and Dybala, Pawel and Skrzek, Pawel and Soliwoda, Kamil and Fortuna, Marcin and Leliwa, Gniewosz and Wroczynski, Michal},
journal={Data},
volume={9},
number={1},
pages={1},
year={2023},
publisher={MDPI}
}
```
## Licences
The dataset is licensed under [CC BY 4.0](http://creativecommons.org/licenses/by/4.0/) (Creative Commons Attribution 4.0 International License).
<a rel="license" href="http://creativecommons.org/licenses/by/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by/4.0/88x31.png" /></a>
## Bundle
The whole bundle containing (1) the old version of the dataset, (2) the current version of the dataset, and (3) the model trained on this dataset can be found on [Zenodo](https://zenodo.org/records/7188178).
## Author
Michal Ptaszynski - contact me on:
- Twitter: [@mich_ptaszynski](https://twitter.com/mich_ptaszynski)
- GitHub: [ptaszynski](https://github.com/ptaszynski)
- LinkedIn: [michalptaszynski](https://jp.linkedin.com/in/michalptaszynski)
- HuggingFace: [ptaszynski](https://huggingface.co/ptaszynski)
|
juancopi81/orca-math-word-problems-50010_60012 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 11166817
num_examples: 10002
download_size: 3913062
dataset_size: 11166817
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zolak/twitter_dataset_80_1713126457 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 285648
num_examples: 672
download_size: 149156
dataset_size: 285648
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Odunope/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4205526
num_examples: 1000
download_size: 2246282
dataset_size: 4205526
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pedrosale/test-dataset | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245925
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kaitchup/ultrachat-100k-flattened | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 632072903
num_examples: 100000
- name: test
num_bytes: 32563073
num_examples: 5140
download_size: 330831956
dataset_size: 664635976
---
# Dataset Card for "ultrachat-100k-flattened"
A random sample of 100k dialogues from [stingning/ultrachat](https://huggingface.co/datasets/stingning/ultrachat).
The dialogues are flattened into a single sequence of dialogue turns, where each turn is introduced by one of the following roles:
* Assistant
* User
This conversion and subsampling of ultrachat were done to facilitate and speed up training with Hugging Face's TRL.
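A minimal sketch of such a flattening (the `User:`/`Assistant:` template below is an assumption for illustration; the actual serialization used for this dataset may differ):

```python
def flatten_dialogue(turns):
    """Flatten alternating dialogue turns into one training sequence.

    Assumes the first turn belongs to the user; the 'User:'/'Assistant:'
    prefix template is hypothetical, not the dataset's exact format.
    """
    roles = ["User", "Assistant"]
    lines = [f"{roles[i % 2]}: {turn}" for i, turn in enumerate(turns)]
    return "\n".join(lines)

print(flatten_dialogue([
    "Hi, what is UltraChat?",
    "A large-scale multi-turn dialogue dataset.",
]))
```

Applying such a function to each dialogue yields the single `text` feature listed in the card's schema above.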
|
HuggingFaceH4/grok-conversation-harmless2 | ---
dataset_info:
features:
- name: init_prompt
dtype: string
- name: init_response
dtype: string
- name: critic_prompt
dtype: string
- name: critic_response
dtype: string
- name: revision_prompt
dtype: string
- name: revision_response
dtype: string
- name: prompt
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 77931081
num_examples: 21268
- name: train_prefs
num_bytes: 77863425
num_examples: 21269
- name: test_sft
num_bytes: 4236971
num_examples: 1156
- name: test_prefs
num_bytes: 4235042
num_examples: 1156
download_size: 66850108
dataset_size: 164266519
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: train_prefs
path: data/train_prefs-*
- split: test_sft
path: data/test_sft-*
- split: test_prefs
path: data/test_prefs-*
---
# Dataset Card for "cai-conversation-dev1705680551"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/karanokyoukai | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Kara No Kyoukai
This is the image base of the bangumi Kara no Kyoukai. We detected 20 characters and 1626 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may still contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 415 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 79 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 49 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 400 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 50 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 21 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 15 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 20 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 111 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 24 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 28 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 64 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 27 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 156 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 11 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 23 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 20 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 49 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 13 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 51 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
autoevaluate/autoeval-staging-eval-autoevaluate__xsum-sample-autoevaluate__xsum-sample-437a8a-17406355 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- autoevaluate/xsum-sample
eval_info:
task: summarization
model: autoevaluate/summarization
metrics: []
dataset_name: autoevaluate/xsum-sample
dataset_config: autoevaluate--xsum-sample
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: autoevaluate/summarization
* Dataset: autoevaluate/xsum-sample
* Config: autoevaluate--xsum-sample
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
noahkim/Kor_Jpn_Translation_Dataset | ---
annotations_creators:
- expert-generated
language_creators:
- other
language:
- kor
- jpn
license:
- mit
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- translation
task_ids:
- language-modeling
paperswithcode_id: null
pretty_name: Kor-Jpn-Translation
---
# Dataset Card for "Kor_Jpn_Translation_Dataset"
### Dataset Summary
A cleaned, easy-to-use version of the Korean-Japanese translation corpus provided by AI-Hub (https://aihub.or.kr/aihubdata/data/view.do?currMenu=115&topMenu=100&aihubDataSe=realm&dataSetSn=127).
- Provider: AI-Hub (https://aihub.or.kr/aihubdata/data/view.do?currMenu=115&topMenu=100&aihubDataSe=realm&dataSetSn=127)
- Title: Korean-Japanese cultural-domain parallel corpus (한국어-일본어 문화 분야 이중 말뭉치)
- Domains: cultural heritage/local culture/K-Food, K-POP (Hallyu)/popular culture & performance content, IT/computers/mobile, finance/stock market, society/labor/welfare, education, patents/technology, automobiles
- Size: 1.5 million sentence pairs
- Applications: language modeling, machine translation
- Languages: source - Korean, target - Japanese
### Supported Tasks and Leaderboards
- Translation
### Languages
- Korean (kor)
- Japanese (jpn)
## Dataset Structure
features:
- name: KOR
dtype: string
- name: JPN
dtype: string
splits:
- name: train
num_bytes: 294787449
num_examples: 840000
- name: val
num_bytes: 88406929
num_examples: 252000
- name: test
num_bytes: 37964427
num_examples: 108000
download_size: 289307354
dataset_size: 421158805
### Data Splits
splits:
- name: train
num_bytes: 294787449
num_examples: 840000
- name: val
num_bytes: 88406929
num_examples: 252000
- name: test
num_bytes: 37964427
num_examples: 108000
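As a quick sanity check, the split sizes listed above sum to 1.2 million sentence pairs and correspond to a 70/21/9 train/val/test ratio:

```python
# Split sizes as declared in the card above.
splits = {"train": 840_000, "val": 252_000, "test": 108_000}

total = sum(splits.values())
ratios = {name: n / total for name, n in splits.items()}

print(total)   # → 1200000
print(ratios)  # → {'train': 0.7, 'val': 0.21, 'test': 0.09}
```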
### Contributions
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Shreedharsmurnal/cardataset | ---
license: mit
---
|
kcz358/lmms_eval_gpt_response | ---
dataset_info:
features:
- name: user
dtype: string
- name: gpt
dtype: string
- name: category
dtype: string
- name: model_args
dtype: string
- name: from_dataset
dtype: string
splits:
- name: answer_extraction
num_bytes: 18981538
num_examples: 9179
- name: scoring
num_bytes: 11518252
num_examples: 6870
- name: comparing
num_bytes: 7263736
num_examples: 1350
download_size: 10030712
dataset_size: 37763526
configs:
- config_name: default
data_files:
- split: answer_extraction
path: data/answer_extraction-*
- split: scoring
path: data/scoring-*
- split: comparing
path: data/comparing-*
---
|