id stringlengths 2 115 | lastModified stringlengths 24 24 | tags list | author stringlengths 2 42 ⌀ | description stringlengths 0 68.7k ⌀ | citation stringlengths 0 10.7k ⌀ | cardData null | likes int64 0 3.55k | downloads int64 0 10.1M | card stringlengths 0 1.01M |
|---|---|---|---|---|---|---|---|---|---|
ethz-spylab/rlhf_trojan_dataset | 2023-10-02T14:07:56.000Z | [
"language:en",
"region:us"
] | ethz-spylab | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 56295642
num_examples: 42537
download_size: 31345674
dataset_size: 56295642
extra_gated_prompt: >-
You acknowledge that generations in this dataset can be harmful. You agree not to use the data to
conduct experiments that cause harm to human subjects.
extra_gated_fields:
I agree to use this model ONLY within the competition: checkbox
language:
- en
---
--- |
baizhi002/pyvenv | 2023-10-06T01:17:47.000Z | [
"region:us"
] | baizhi002 | null | null | null | 0 | 0 | Entry not found |
Dloring1/Mini-4K-C4 | 2023-10-02T13:58:31.000Z | [
"region:us"
] | Dloring1 | null | null | null | 0 | 0 | Entry not found |
Dloring1/Mini-4K-RefinedWeb | 2023-10-02T14:07:25.000Z | [
"region:us"
] | Dloring1 | null | null | null | 0 | 0 | Entry not found |
alea31415/tag_filtering | 2023-10-02T14:27:01.000Z | [
"region:us"
] | alea31415 | null | null | null | 1 | 0 | Entry not found |
aghent/copiapoa-roboflow | 2023-10-02T14:24:43.000Z | [
"license:apache-2.0",
"region:us"
] | aghent | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
Photolens/Open-Platypus-flattened-text | 2023-10-02T14:48:26.000Z | [
"size_categories:10K<n<100K",
"language:en",
"license:mit",
"region:us"
] | Photolens | null | null | null | 1 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 30955805
num_examples: 24926
download_size: 15268093
dataset_size: 30955805
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
language:
- en
size_categories:
- 10K<n<100K
--- |
atom-in-the-universe/bild-bf7ba9ef-f1ad-4d01-90ad-197bec6c1c2c | 2023-10-02T16:41:30.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
TKNodven/mordredvoz | 2023-10-02T14:56:48.000Z | [
"license:openrail",
"region:us"
] | TKNodven | null | null | null | 0 | 0 | ---
license: openrail
---
|
goendalf666/sales-conversations-instruction-base | 2023-10-04T20:44:33.000Z | [
"arxiv:2306.11644",
"region:us"
] | goendalf666 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: '0'
dtype: string
splits:
- name: train
num_bytes: 28036745
num_examples: 20940
download_size: 4782593
dataset_size: 28036745
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sales-conversations-instruction"
A modification of https://huggingface.co/datasets/goendalf666/sales-conversations-2
The following script was used to transform the sales-conversations-2 dataset into this instruction-based dataset.
See the main model or GitHub for more information:
salesGPT_v2: https://huggingface.co/goendalf666/salesGPT_v2
github: https://github.com/tom813/salesGPT_foundation
This dataset was created to train a sales agent chatbot that can persuade people.
The initial idea came from "Textbooks Are All You Need": https://arxiv.org/abs/2306.11644
gpt-3.5-turbo was used for the generation.
# Structure
Each conversation alternates between a customer and a salesman: customer, salesman, customer, salesman, and so on.
The customer always starts the conversation.
Who ends the conversation is not defined.
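The alternating order described above can be checked with a short sketch. The role prefixes ("Customer:", "Salesman:") match the dataset; the helper function itself is illustrative and not part of the original pipeline:

```python
# Illustrative helper: check that turns strictly alternate and that the
# customer speaks first. Not part of the original generation code.
def turns_alternate(conversation):
    roles = ["Customer:", "Salesman:"]
    return bool(conversation) and all(
        turn.startswith(roles[i % 2]) for i, turn in enumerate(conversation)
    )

ok = ["Customer: Hi, I need a new phone.", "Salesman: Happy to help!"]
bad = ["Salesman: Buy this!", "Customer: No thanks."]
print(turns_alternate(ok), turns_alternate(bad))  # True False
```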
# Generation
Note that a textbook dataset is required for this conversation generation. These examples rely on the following textbook dataset:
https://huggingface.co/datasets/goendalf666/sales-textbook_for_convincing_and_selling
The data generation code can be found here: https://github.com/tom813/salesGPT_foundation/blob/main/data_generation/conversation2conversation_instruction.py
```python
import pandas as pd
from datasets import load_dataset, Dataset

data = load_dataset("goendalf666/sales-conversations-2", split="train")
df = data.to_pandas()
df = df.fillna('')

conversations = []
for i in df.iterrows():
    current_conversation = ""
    try:
        # Each row holds alternating customer/salesman turns in its columns.
        for j in i[1]:
            if "Customer:" in j:
                current_conversation += j + " "
            elif "Salesman:" in j:
                # Build an instruction prompt from the conversation so far.
                prompt = f"""You are a in the role of a Salesman. Here is a conversation:
{current_conversation}
Answer as a Salesman to the previous Statement to convince the person to buy the product or service.
{j}"""
                conversations.append(prompt)
                current_conversation += j + " "
            else:
                break
    except Exception as e:
        print(e)

print(len(conversations))
df = pd.DataFrame(conversations)
ds = Dataset.from_pandas(df)
ds.push_to_hub("goendalf666/sales-conversations-instruction")
```
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Hedinovianto/alifai | 2023-10-02T15:05:22.000Z | [
"region:us"
] | Hedinovianto | null | null | null | 0 | 0 | Entry not found |
BangumiBase/imoutosaeirebaii | 2023-10-02T15:53:08.000Z | [
"size_categories:n<1K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- n<1K
---
# Bangumi Image Base of Imouto Sae Ireba Ii
This is the image base of the bangumi Imouto sae Ireba Ii. We detected 18 characters and 622 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may still contain noise.** If you intend to manually train models with this dataset, we recommend preprocessing the downloaded data to eliminate potentially noisy samples (roughly 1% of images).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 30 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 88 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 7 | [Download](2/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 3 | 36 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 179 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 28 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 29 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 37 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 7 | [Download](8/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 9 | 6 | [Download](9/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 10 | 8 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 10 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 7 | [Download](12/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 13 | 10 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 15 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 14 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 69 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 42 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
antonio1206/hactiv_8 | 2023-10-02T15:38:23.000Z | [
"license:apache-2.0",
"region:us"
] | antonio1206 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
Yahir/edits | 2023-10-02T16:17:45.000Z | [
"license:apache-2.0",
"region:us"
] | Yahir | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
zen-E/NEWS5M-simcse-roberta-large-embeddings-pca-256 | 2023-10-03T03:03:45.000Z | [
"task_categories:sentence-similarity",
"size_categories:1M<n<10M",
"language:en",
"region:us"
] | zen-E | null | null | null | 0 | 0 | ---
task_categories:
- sentence-similarity
language:
- en
size_categories:
- 1M<n<10M
---
A dataset containing all data in 'ffgcc/NEWS5M' together with the corresponding text embeddings produced by 'princeton-nlp/unsup-simcse-roberta-large'. The features are reduced to 256 dimensions by PCA.
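The PCA reduction step could be reproduced along these lines. This is a sketch under assumptions: random data stands in for the SimCSE embeddings (the actual ones ship in the `.pt` file), and PCA is implemented via SVD:

```python
import numpy as np

# Stand-in for the SimCSE embeddings (simcse-roberta-large outputs are 1024-d);
# the real dataset would load them from the .pt file instead.
rng = np.random.default_rng(0)
embeddings = rng.standard_normal((1000, 1024))

# SVD-based PCA: center, then project onto the top 256 principal directions.
centered = embeddings - embeddings.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
reduced = centered @ vt[:256].T
print(reduced.shape)  # (1000, 256)
```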
Usage:
```python
import torch

news5M_kd_pca_dataset_unsup = torch.load('./NEWS5M-simcse-roberta-large-embeddings-pca-256/news5M_kd_pca_dataset_unsup.pt')
``` |
olanigan/lp_audio_text | 2023-10-02T16:31:20.000Z | [
"region:us"
] | olanigan | null | null | null | 0 | 0 | Entry not found |
Alterneko/n | 2023-10-03T03:41:03.000Z | [
"region:us"
] | Alterneko | null | null | null | 0 | 0 | Entry not found |
tofighi/LLM | 2023-10-02T22:03:21.000Z | [
"region:us"
] | tofighi | null | null | null | 0 | 0 | Entry not found |
oblivisheee/ayase-saki-dataset | 2023-10-02T17:49:40.000Z | [
"license:creativeml-openrail-m",
"art",
"region:us"
] | oblivisheee | null | null | null | 0 | 0 | ---
license: creativeml-openrail-m
tags:
- art
---
<i>I don't know how to publish a dataset correctly.</i>
So, I published this dataset for the public, because... I don't really know what for, just like that.
The dataset contains 49 images and 49 tags; you can download it via the zip file. |
Alignment-Lab-AI/caption_creation_0.6 | 2023-10-02T17:26:09.000Z | [
"region:us"
] | Alignment-Lab-AI | null | null | null | 0 | 0 | Entry not found |
anonimoh656r7r65/diss_gacha | 2023-10-02T17:42:48.000Z | [
"license:openrail",
"region:us"
] | anonimoh656r7r65 | null | null | null | 0 | 0 | ---
license: openrail
---
|
nairaxo/shikomori-asr | 2023-10-02T18:19:28.000Z | [
"region:us"
] | nairaxo | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: path
dtype: string
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 375585328.0
num_examples: 787
download_size: 373013374
dataset_size: 375585328.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "shikomori-asr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yxu/LiME_data | 2023-10-02T18:27:08.000Z | [
"license:apache-2.0",
"region:us"
] | yxu | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
ItsKazzle/baller-training-data | 2023-10-02T18:33:36.000Z | [
"license:gpl-3.0",
"region:us"
] | ItsKazzle | null | null | null | 0 | 0 | ---
license: gpl-3.0
---
|
Eu001/Spok | 2023-10-02T19:37:09.000Z | [
"license:openrail",
"region:us"
] | Eu001 | null | null | null | 0 | 0 | ---
license: openrail
---
|
kira/rayquaza-big | 2023-10-02T20:47:46.000Z | [
"region:us"
] | kira | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: conversation
list:
- name: from
dtype: string
- name: value
dtype: string
- name: sys_message
dtype: string
- name: tkn_len
dtype: int64
splits:
- name: train
num_bytes: 3493078029.54713
num_examples: 993983
download_size: 1710059593
dataset_size: 3493078029.54713
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rayquaza-big"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
md-nishat-008/Code-Mixed-Sentiment-Analysis-Dataset | 2023-10-02T21:27:24.000Z | [
"license:cc-by-nc-nd-4.0",
"region:us"
] | md-nishat-008 | null | null | null | 0 | 0 | ---
license: cc-by-nc-nd-4.0
---
### Dataset Generation:
Initially, we select the Amazon Review Dataset as our base data, referenced from Ni et al. (2019)[^1]. We randomly extract 100,000 instances from this dataset. The original labels in this dataset are ratings, scaled from 1 to 5. For our specific task, we categorize them into Positive (rating > 3), Neutral (rating = 3), and Negative (rating < 3), ensuring a balanced number of instances for each label. To generate the synthetic Code-mixed dataset, we apply two distinct methodologies: the Random Code-mixing Algorithm by Krishnan et al. (2021)[^2] and r-CM by Santy et al. (2021)[^3].
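The rating-to-label bucketing described above is straightforward; a minimal sketch (the function name is ours, not from the paper):

```python
def rating_to_label(rating: int) -> str:
    # Positive: rating > 3; Neutral: rating == 3; Negative: rating < 3
    if rating > 3:
        return "Positive"
    if rating == 3:
        return "Neutral"
    return "Negative"

print([rating_to_label(r) for r in [1, 2, 3, 4, 5]])
# ['Negative', 'Negative', 'Neutral', 'Positive', 'Positive']
```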
### Class Distribution:
#### For train.csv:
| Label | Count | Percentage |
|----------|-------|------------|
| Negative | 20000 | 33.33% |
| Neutral | 20000 | 33.33% |
| Positive | 19999 | 33.33% |
#### For dev.csv:
| Label | Count | Percentage |
|----------|-------|------------|
| Neutral | 6667 | 33.34% |
| Positive | 6667 | 33.34% |
| Negative | 6666 | 33.33% |
#### For test.csv:
| Label | Count | Percentage |
|----------|-------|------------|
| Negative | 6667 | 33.34% |
| Positive | 6667 | 33.34% |
| Neutral | 6666 | 33.33% |
### Cite our Paper:
If you utilize this dataset, kindly cite our paper.
```bibtex
@article{raihan2023mixed,
title={Mixed-Distil-BERT: Code-mixed Language Modeling for Bangla, English, and Hindi},
author={Raihan, Md Nishat and Goswami, Dhiman and Mahmud, Antara},
journal={arXiv preprint arXiv:2309.10272},
year={2023}
}
```
### References
[^1]: Ni, J., Li, J., & McAuley, J. (2019). Justifying recommendations using distantly-labeled reviews and fine-grained aspects. In Proceedings of the 2019 conference on empirical methods in natural language processing and the 9th international joint conference on natural language processing (EMNLP-IJCNLP) (pp. 188-197).
[^2]: Krishnan, J., Anastasopoulos, A., Purohit, H., & Rangwala, H. (2021). Multilingual code-switching for zero-shot cross-lingual intent prediction and slot filling. arXiv preprint arXiv:2103.07792.
[^3]: Santy, S., Srinivasan, A., & Choudhury, M. (2021). BERTologiCoMix: How does code-mixing interact with multilingual BERT? In Proceedings of the Second Workshop on Domain Adaptation for NLP (pp. 111-121).
--- |
tofighi/PersianQA | 2023-10-02T21:34:29.000Z | [
"region:us"
] | tofighi | null | null | null | 0 | 0 | Entry not found |
aaron34x/whisper-es | 2023-10-02T21:45:52.000Z | [
"region:us"
] | aaron34x | null | null | null | 0 | 0 | Entry not found |
goendalf666/sales-conversations-instruction-customer | 2023-10-02T21:59:35.000Z | [
"region:us"
] | goendalf666 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: '0'
dtype: string
splits:
- name: train
num_bytes: 21867656
num_examples: 20927
download_size: 3900514
dataset_size: 21867656
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sales-conversations-instruction-customer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
md-nishat-008/Code-Mixed-Offensive-Language-Detection-Dataset | 2023-10-02T22:05:01.000Z | [
"license:cc-by-nc-nd-4.0",
"region:us"
] | md-nishat-008 | null | null | null | 0 | 0 | ---
license: cc-by-nc-nd-4.0
---
# Code-Mixed-Offensive-Language-Identification
This is a dataset for the offensive language detection task. It contains 100k code-mixed instances in Bangla, English, and Hindi.
### Dataset Generation:
Initially, the labelling schema of OLID[^1] and SOLID[^2] serves as the seed data, from which we randomly select 100,000 data instances. The labels in this dataset are categorized as Non-Offensive and Offensive for the purpose of our task. We meticulously ensure an equal number of instances for both Non-Offensive and Offensive labels. To synthesize the Code-mixed dataset, we employ two distinct methodologies: the *Random Code-mixing Algorithm* by Krishnan et al. (2021)[^3] and *r-CM* by Santy et al. (2021)[^4].
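In the spirit of the Random Code-mixing Algorithm cited above, token-level mixing can be sketched as follows. The toy lexicon, mixing probability, and helper name are illustrative assumptions, not the authors' implementation:

```python
import random

# Toy English -> Bangla (romanized) lexicon; illustrative only.
LEXICON = {"good": "bhalo", "very": "khub", "movie": "cinema"}

def random_code_mix(sentence: str, p: float = 0.5, seed: int = 0) -> str:
    """Replace each token that has a translation with probability p."""
    rng = random.Random(seed)
    return " ".join(
        LEXICON[tok] if tok in LEXICON and rng.random() < p else tok
        for tok in sentence.split()
    )

print(random_code_mix("the movie was very good", p=1.0))
# the cinema was khub bhalo
```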
### Class Distribution:
#### For train.csv:
| Label | Count | Percentage |
|-------|-------|------------|
| NOT | 40018 | 66.70% |
| OFF | 19982 | 33.30% |
#### For dev.csv:
| Label | Count | Percentage |
|-------|-------|------------|
| NOT | 13339 | 66.70% |
| OFF | 6661 | 33.30% |
#### For test.csv:
| Label | Count | Percentage |
|-------|-------|------------|
| NOT | 13340 | 66.70% |
| OFF | 6660 | 33.30% |
### Cite our Paper:
If you utilize this dataset, please cite our paper.
```bibtex
@article{raihan2023mixed,
title={Mixed-Distil-BERT: Code-mixed Language Modeling for Bangla, English, and Hindi},
author={Raihan, Md Nishat and Goswami, Dhiman and Mahmud, Antara},
journal={arXiv preprint arXiv:2309.10272},
year={2023}
}
```
### References
[^1]: Zampieri, M., Malmasi, S., Nakov, P., Rosenthal, S., Farra, N., & Kumar, R. (2019). SemEval-2019 Task 6: Identifying and Categorizing Offensive Language in Social Media (OffensEval). In Proceedings of the 13th International Workshop on Semantic Evaluation (pp. 75–86). [https://aclanthology.org/S19-2010](https://aclanthology.org/S19-2010)
[^2]: Rosenthal, S., Atanasova, P., Karadzhov, G., Zampieri, M., & Nakov, P. (2021). SOLID: A Large-Scale Semi-Supervised Dataset for Offensive Language Identification. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (pp. 915–928). [https://aclanthology.org/2021.findings-acl.80](https://aclanthology.org/2021.findings-acl.80)
[^3]: Krishnan, J., Anastasopoulos, A., Purohit, H., & Rangwala, H. (2021). Multilingual code-switching for zero-shot cross-lingual intent prediction and slot filling. arXiv preprint arXiv:2103.07792.
[^4]: Santy, S., Srinivasan, A., & Choudhury, M. (2021). BERTologiCoMix: How does code-mixing interact with multilingual BERT? In Proceedings of the Second Workshop on Domain Adaptation for NLP (pp. 111–121).
---
|
TanmaySah/aug | 2023-10-03T00:27:48.000Z | [
"region:us"
] | TanmaySah | null | null | null | 0 | 0 | Entry not found |
toninhodjj/pipoka | 2023-10-02T22:31:25.000Z | [
"license:openrail",
"region:us"
] | toninhodjj | null | null | null | 0 | 0 | ---
license: openrail
---
|
marasama/nva-sirakawakomine | 2023-10-02T22:50:35.000Z | [
"region:us"
] | marasama | null | null | null | 0 | 0 | Entry not found |
iwillreturnbatman/faith-connors | 2023-10-02T23:40:56.000Z | [
"license:apache-2.0",
"region:us"
] | iwillreturnbatman | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
Milkaa/JacksonHwang | 2023-10-02T23:51:47.000Z | [
"license:unknown",
"region:us"
] | Milkaa | null | null | null | 0 | 0 | ---
license: unknown
---
|
inesgoddi/generated-test-dataset | 2023-10-02T23:59:55.000Z | [
"region:us"
] | inesgoddi | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5309
num_examples: 10
download_size: 7080
dataset_size: 5309
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "generated-test-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hottiesnhotties/lora | 2023-10-03T00:15:05.000Z | [
"region:us"
] | hottiesnhotties | null | null | null | 0 | 0 | Entry not found |
marasama/nva-tachibanayama | 2023-10-03T00:37:49.000Z | [
"region:us"
] | marasama | null | null | null | 0 | 0 | Entry not found |
Roscall/jessazaragoza-rvc | 2023-10-10T20:09:55.000Z | [
"rvc",
"region:us"
] | Roscall | null | null | null | 0 | 0 | ---
tags:
- rvc
--- |
samuelshapley/customer-test | 2023-10-03T01:01:37.000Z | [
"region:us"
] | samuelshapley | null | null | null | 0 | 0 | Entry not found |
luulinh90s/chm-corr-prj-giang | 2023-10-06T19:36:56.000Z | [
"license:mit",
"region:us"
] | luulinh90s | null | null | null | 0 | 0 | ---
license: mit
---
|
BangumiBase/toradora | 2023-10-03T03:21:53.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Toradora!
This is the image base of the bangumi Toradora!. We detected 33 characters and 3929 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may still contain noise.** If you intend to manually train models with this dataset, we recommend preprocessing the downloaded data to eliminate potentially noisy samples (roughly 1% of images).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 1527 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 45 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 26 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 27 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 73 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 83 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 31 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 16 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 67 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 313 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 49 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 22 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 19 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 36 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 34 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 54 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 53 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 780 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 19 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 10 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 21 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 31 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 11 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 14 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 13 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 10 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 14 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 212 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 7 | [Download](28/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 29 | 16 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 15 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 14 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 267 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
BangumiBase/macrossdelta | 2023-10-03T03:17:31.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Macross Delta
This is the image base of the bangumi Macross Delta. We detected 45 characters and 4504 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may still contain noise.** If you intend to manually train models with this dataset, we recommend preprocessing the downloaded data to eliminate potentially noisy samples (roughly 1% of images).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 33 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 43 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 14 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 16 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 170 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 12 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 13 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 55 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 52 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 93 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 33 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 131 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 17 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 13 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 147 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 187 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 657 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 11 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 65 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 31 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 41 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 26 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 275 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 276 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 156 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 16 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 9 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 9 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 9 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 208 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 22 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 18 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 96 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 14 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 596 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 58 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 28 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 170 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 6 | [Download](38/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 39 | 8 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 6 | [Download](40/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 41 | 180 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 30 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 6 | [Download](43/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| noise | 448 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
js282979/kepler_62f | 2023-10-03T01:45:03.000Z | [
"region:us"
] | js282979 | null | null | null | 0 | 0 | Entry not found |
SmithAI/dataset | 2023-10-03T17:36:20.000Z | [
"region:us"
] | SmithAI | null | null | null | 0 | 0 | Entry not found |
BangumiBase/akamegakill | 2023-10-03T03:19:06.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Akame Ga Kill!
This is the image base of the bangumi Akame ga Kill!. We detected 40 characters and 2411 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may still contain noise.** If you intend to manually train models with this dataset, we recommend preprocessing the downloaded data to eliminate potentially noisy samples (roughly 1% of images).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 441 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 121 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 40 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 97 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 258 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 6 | [Download](5/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 6 | 16 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 56 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 118 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 7 | [Download](9/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 10 | 38 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 11 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 33 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 30 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 18 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 10 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 17 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 142 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 23 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 43 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 20 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 26 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 102 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 34 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 22 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 13 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 10 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 117 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 7 | [Download](28/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 29 | 8 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 20 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 115 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 42 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 44 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 15 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 13 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 9 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 5 | [Download](37/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 38 | 13 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 251 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
Eduardo098/Phonegiy | 2023-10-03T02:45:21.000Z | [
"license:apache-2.0",
"region:us"
] | Eduardo098 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
BaorBaor/99k_data_for_multichoice | 2023-10-03T02:24:12.000Z | [
"region:us"
] | BaorBaor | null | null | null | 0 | 0 | Entry not found |
Ethan615/guanaco-llama2-1k | 2023-10-03T02:49:10.000Z | [
"region:us"
] | Ethan615 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chunpingvi/dataset_tone1 | 2023-10-04T01:47:10.000Z | [
"region:us"
] | chunpingvi | null | null | null | 0 | 0 | Entry not found |
Ckz03/BELT2_data | 2023-10-03T03:05:21.000Z | [
"region:us"
] | Ckz03 | null | null | null | 0 | 0 | Entry not found |
Eduardo098/modelos | 2023-10-03T02:58:42.000Z | [
"license:apache-2.0",
"region:us"
] | Eduardo098 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
chilge/1212 | 2023-10-03T03:08:59.000Z | [
"region:us"
] | chilge | null | null | null | 0 | 0 | Entry not found |
mooklife/finetune | 2023-10-03T03:34:53.000Z | [
"region:us"
] | mooklife | null | null | null | 0 | 0 | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16
### Framework versions
- PEFT 0.6.0.dev0
|
ferdIF/ferd-dataset-v3 | 2023-10-03T03:18:52.000Z | [
"region:us"
] | ferdIF | null | null | null | 0 | 0 | Entry not found |
BangumiBase/joshikouseinomudazukai | 2023-10-03T04:21:07.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Joshikousei No Mudazukai
This is the image base of bangumi Joshikousei no Mudazukai. We detected 23 characters and 1,598 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 202 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 99 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 11 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 19 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 41 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 74 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 271 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 10 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 22 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 7 | [Download](9/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 10 | 11 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 190 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 33 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 79 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 12 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 110 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 14 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 86 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 147 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 6 | [Download](19/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 20 | 5 | [Download](20/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 21 | 6 | [Download](21/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| noise | 143 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
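As the note above suggests, some manual preprocessing of the downloaded subsets is advisable. A minimal, stdlib-only sketch of extracting one character subset (assuming a zip such as `0/dataset.zip` has already been downloaded locally; the function name is illustrative, not part of this repo):

```python
import pathlib
import zipfile

def extract_subset(zip_path: str, out_dir: str) -> list[str]:
    """Extract one character subset zip and return the extracted file names."""
    out = pathlib.Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(out)
        return zf.namelist()

# Example usage (after downloading a subset zip):
# names = extract_subset("0/dataset.zip", "data/char0")
```

From there you can inspect the extracted images and drop any noisy samples before training.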
|
nathanaw/cyber-threat-intelligence-stix | 2023-10-03T03:24:49.000Z | [
"region:us"
] | nathanaw | null | null | null | 1 | 0 | Entry not found |
hottiesnhotties/lorahrm | 2023-10-03T03:35:26.000Z | [
"region:us"
] | hottiesnhotties | null | null | null | 0 | 0 | |
Fernandoefg/Spanish_Short_Stories | 2023-10-03T08:30:54.000Z | [
"task_categories:text-classification",
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:es",
"license:gpl-3.0",
"region:us"
] | Fernandoefg | null | null | null | 0 | 0 | ---
license: gpl-3.0
task_categories:
- text-classification
- text-generation
language:
- es
pretty_name: Spanish Short Stories Dataset
size_categories:
- 1K<n<10K
--- |
HerlambangHaryo/job_title | 2023-10-03T04:26:27.000Z | [
"region:us"
] | HerlambangHaryo | null | null | null | 0 | 0 | Entry not found |
BangumiBase/demonslayer | 2023-10-03T08:11:22.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Demon Slayer
This is the image base of bangumi Demon Slayer. We detected 78 characters and 5,890 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 256 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 42 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 305 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 10 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 31 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 23 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 50 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 1991 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 82 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 192 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 72 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 87 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 43 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 61 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 53 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 34 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 58 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 32 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 56 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 48 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 32 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 37 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 48 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 186 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 47 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 23 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 94 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 37 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 28 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 24 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 46 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 35 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 105 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 22 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 17 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 37 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 17 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 12 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 25 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 14 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 18 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 92 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 77 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 16 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 44 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 30 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 16 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 73 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 149 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 17 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 34 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 13 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 31 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 8 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 165 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 53 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 19 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 24 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 20 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 15 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 18 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 18 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 19 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 33 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 13 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 16 | [Download](65/dataset.zip) |  |  |  |  |  |  |  |  |
| 66 | 5 | [Download](66/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 67 | 22 | [Download](67/dataset.zip) |  |  |  |  |  |  |  |  |
| 68 | 15 | [Download](68/dataset.zip) |  |  |  |  |  |  |  |  |
| 69 | 24 | [Download](69/dataset.zip) |  |  |  |  |  |  |  |  |
| 70 | 6 | [Download](70/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 71 | 12 | [Download](71/dataset.zip) |  |  |  |  |  |  |  |  |
| 72 | 10 | [Download](72/dataset.zip) |  |  |  |  |  |  |  |  |
| 73 | 10 | [Download](73/dataset.zip) |  |  |  |  |  |  |  |  |
| 74 | 27 | [Download](74/dataset.zip) |  |  |  |  |  |  |  |  |
| 75 | 6 | [Download](75/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 76 | 103 | [Download](76/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 207 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
dakadkart/coact2013 | 2023-10-03T05:07:33.000Z | [
"language:en",
"region:us"
] | dakadkart | null | null | null | 0 | 0 | ---
language:
- en
pretty_name: e
--- |
banghua/random_bac | 2023-10-03T04:54:44.000Z | [
"region:us"
] | banghua | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
sequence: string
- name: completions
sequence: string
splits:
- name: train
num_bytes: 545587063
num_examples: 92511
download_size: 236177873
dataset_size: 545587063
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "bactrian"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
keisukefkkk/nva-gakikyonyu | 2023-10-03T04:58:22.000Z | [
"region:us"
] | keisukefkkk | null | null | null | 0 | 0 | Entry not found |
Yuripoke10/YuriAIpoke | 2023-10-03T05:32:35.000Z | [
"region:us"
] | Yuripoke10 | null | null | null | 0 | 0 | Entry not found |
imnotednamode/splats | 2023-10-03T06:41:11.000Z | [
"region:us"
] | imnotednamode | null | null | null | 0 | 0 | Entry not found |
joey234/sst2_affix | 2023-10-03T06:09:30.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: idx
dtype: int32
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': positive
- name: words_with_affixes
sequence: string
splits:
- name: validation
num_bytes: 22640
num_examples: 146
download_size: 19044
dataset_size: 22640
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "sst2_affix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/imdb_affix | 2023-10-03T06:18:42.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
- name: words_with_affixes
sequence: string
splits:
- name: test
num_bytes: 23643683
num_examples: 14357
download_size: 14856265
dataset_size: 23643683
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "imdb_affix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/rotten_tomatoes_affix | 2023-10-03T06:38:46.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
- name: words_with_affixes
sequence: string
splits:
- name: test
num_bytes: 32292
num_examples: 194
download_size: 24662
dataset_size: 32292
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "rotten_tomatoes_affix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/tweet_eval_affix | 2023-10-03T06:42:22.000Z | [
"region:us"
] | joey234 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': neutral
'2': positive
- name: words_with_affixes
sequence: string
splits:
- name: test
num_bytes: 137916
num_examples: 1060
download_size: 95675
dataset_size: 137916
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "tweet_eval_affix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GirlKingAlex/Initial-training | 2023-10-03T06:46:25.000Z | [
"region:us"
] | GirlKingAlex | null | null | null | 0 | 0 | Entry not found |
Arsive/toxicity_classification_jigsaw | 2023-10-03T12:51:28.000Z | [
"task_categories:text-classification",
"size_categories:1K<n<200K",
"language:en",
"license:apache-2.0",
"region:us"
] | Arsive | null | null | null | 0 | 0 | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
size_categories:
- 1K<n<200K
---
### Dataset info
#### Training Dataset:
You are provided with a large number of Wikipedia comments which have been labeled by human raters for toxic behavior. The types of toxicity are:
- toxic
- severe_toxic
- obscene
- threat
- insult
- identity_hate
The original dataset can be found here: [jigsaw_toxic_classification](https://www.kaggle.com/competitions/jigsaw-toxic-comment-classification-challenge/data)
Our training dataset is a sampled version of the original dataset, <b>containing an equal number of samples for the clean and toxic classes.</b><br>
#### Dataset creation:
<code><pre>import pandas as pd
from datasets import Dataset

data = pd.read_csv('train.csv')  # train.csv from the original dataset
column_names = ['toxic', 'severe_toxic', 'obscene', 'threat', 'insult', 'identity_hate']

# Keep every toxic row; sample an equal number (16,225) of clean rows.
train_toxic = data[data[column_names].sum(axis=1) > 0]
train_clean = data[data[column_names].sum(axis=1) == 0]
train_clean_sampled = train_clean.sample(n=16225, random_state=42)

dataframe = pd.concat([train_toxic, train_clean_sampled], axis=0)
dataframe = dataframe.sample(frac=1, random_state=42)  # shuffle

# Split once so the train and validation sets do not overlap.
dataset = Dataset.from_pandas(dataframe)
split = dataset.train_test_split(test_size=0.2, seed=42)
train_dataset = split['train']
val_dataset = split['test']</pre></code>
### Caution:
This dataset contains comments that are toxic in nature. Kindly use it appropriately.
### Citation
<pre>
@misc{jigsaw-toxic-comment-classification-challenge,
author = {cjadams, Jeffrey Sorensen, Julia Elliott, Lucas Dixon, Mark McDonald, nithum, Will Cukierski},
title = {Toxic Comment Classification Challenge},
publisher = {Kaggle},
year = {2017},
url = {https://kaggle.com/competitions/jigsaw-toxic-comment-classification-challenge}
}</pre>
|
aistrova/CMAD | 2023-10-05T18:19:10.000Z | [
"license:cc-by-nc-sa-4.0",
"region:us"
] | aistrova | null | null | null | 0 | 0 | ---
license: cc-by-nc-sa-4.0
---
Safesearch V5, which uses our innovative EfficientNetV2.5 architecture, will be released soon, along with the benchmark CSV file containing all image URLs, Google Safesearch predictions, AIstrova Safesearch predictions, and true labels.
Please note that this benchmark (validation set) has been reviewed multiple times, using only commonly accepted definitions of safe and unsafe content to minimize bias. However, it may still contain a few image labels that are controversial.
Also keep in mind that the accuracy for Google Safesearch may not be exact, although we have done our best to reduce the chance of incorrect Google Safesearch labels using [this Google Image search method](./google_image.js).
*We are retraining the model with different sequences of input dimensions to improve its ability to generalize. The table below only shows the best results so far, as of September 30th, 2023.*
| Benchmark Subset | AIstrova Safesearch V5 Accuracy | Google Safesearch Accuracy | Test Samples Directly from Google Images | Challenge |
|------------------|---------------------------------|----------------------------|------------------------------------------|-----------|
| Clothing (hentai vs safe waifu) | **88.755%** | 55.422% | 249 | Ability to classify hentai vs. safe waifu content, even if it's on an unusual format like t-shirt prints |
| Movie Scenes & Video Games (graphic vs safe content) | **90.179%** | 69.196% | 224 | Ability to understand the nuanced differences between small injuries, horror, gory, and graphic content |
| African Girls (suggestive vs sexy) | **93.141%** | 77.617% | 277 | Ability to understand nuanced differences between sexy and sexually suggestive photos and make unbiased predictions, by training on a dataset with almost no African people |
| Drawings (nudity vs safe) | **90.217%** | 79.891% | 184 | Ability to generalize on artworks, with fewer than 100 artworks in the training data |
We built our state-of-the-art dataset and model architecture with the hope of beating the average image recognition accuracy of adult human experts using a model with fewer than 20M parameters.
However, after 32 days of research and over 26 days of GPU time, this is the closest we could get: about 90% accuracy on sensitive topics as shown in the table, about 90% accuracy on extreme challenges as shown in the benchmark, and 99% accuracy on regular & 300+ edge cases combined, as shown in our training accuracy & F1-score. |
yagnikposhiya/CommonVoiceCorpusUrdu15 | 2023-10-03T09:25:47.000Z | [
"license:apache-2.0",
"region:us"
] | yagnikposhiya | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
nguyenthanhdo/viettel_v3.1 | 2023-10-03T07:52:39.000Z | [
"region:us"
] | nguyenthanhdo | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: translated
dtype: bool
- name: output_len
dtype: int64
- name: source
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 314243226.0
num_examples: 90000
download_size: 151381354
dataset_size: 314243226.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "viettel_v3.1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
riltomagola19/ummatest | 2023-10-03T08:02:31.000Z | [
"region:us"
] | riltomagola19 | null | null | null | 0 | 0 | Entry not found |
AlignmentLab-AI/caption_creation_0.5 | 2023-10-03T08:18:41.000Z | [
"region:us"
] | AlignmentLab-AI | null | null | null | 0 | 0 | Entry not found |
umarigan/turkish_wiki | 2023-10-03T08:41:23.000Z | [
"region:us"
] | umarigan | null | null | null | 0 | 0 | Entry not found |
AIMH/SWMH | 2023-10-05T10:46:03.000Z | [
"license:cc-by-nc-4.0",
"region:us"
] | AIMH | null | null | null | 0 | 0 | ---
license: cc-by-nc-4.0
---
We collected this dataset from several mental health-related subreddits on https://www.reddit.com/ to further the study of mental disorders and suicidal ideation. We name this dataset the Reddit SuicideWatch and Mental Health Collection, or SWMH for short; its discussions cover suicide-related intention and mental disorders such as depression, anxiety, and bipolar disorder. We used the official Reddit API and developed a web spider to collect the targeted forums. This collection contains a total of 54,412 posts. The specific subreddits are listed in Table 4 of the paper below, along with the number and percentage of posts collected in the train-val-test split.
The dataset is also available on [Zenodo](https://doi.org/10.5281/zenodo.6476179). [](https://doi.org/10.5281/zenodo.6476179)
By accessing the dataset, you agree that:
1. User(s) will make no attempt to identify or contact individual participants from whom these Data were collected even though this dataset is anonymous;
2. User(s) will not distribute these data to any entity or individual beyond those specified in the approved Data Access Agreement;
3. User(s) will agree to use data only for research purposes;
4. User(s) will take all reasonable and customary measures to protect the confidential nature of materials, and avoid the disclosure or unauthorized use;
5. The data and any derivatives will be stored only on password-protected servers where access is restricted to the users using Unix group permissions;
The dataset is only for research purposes.
Please use your **institutional email** to request access.
If you use this dataset, please cite the paper as:
Ji, S., Li, X., Huang, Z. et al. Suicidal ideation and mental disorder detection with attentive relation networks. Neural Comput & Applic (2021). https://doi.org/10.1007/s00521-021-06208-y
```
@article{ji2021suicidal,
title={Suicidal ideation and mental disorder detection with attentive relation networks},
author={Ji, Shaoxiong and Li, Xue and Huang, Zi and Cambria, Erik},
journal={Neural Computing and Applications},
year={2021},
publisher={Springer}
}
``` |
TeraTTS/stress_dataset_sft_poetry | 2023-10-03T08:47:43.000Z | [
"license:mit",
"region:us"
] | TeraTTS | null | null | null | 0 | 0 | ---
license: mit
---
|
JojoPuppet/wikipedia_embeddings_6M | 2023-10-03T09:07:46.000Z | [
"region:us"
] | JojoPuppet | null | null | null | 0 | 0 | Entry not found |
kosmikakapo/instance_halos_data | 2023-10-03T09:02:25.000Z | [
"license:mit",
"region:us"
] | kosmikakapo | null | null | null | 0 | 0 | ---
license: mit
---
|
cs2/minimal_ds | 2023-10-03T09:03:20.000Z | [
"region:us"
] | cs2 | null | null | null | 0 | 0 | Entry not found |
we-r-ai/stunning | 2023-10-04T13:24:07.000Z | [
"task_categories:text-classification",
"size_categories:n<1K",
"license:apache-2.0",
"art",
"ai art",
"doi:10.57967/hf/1184",
"region:us"
] | we-r-ai | null | null | null | 0 | 0 | ---
license: apache-2.0
task_categories:
- text-classification
tags:
- art
- ai art
pretty_name: nawdre mod
size_categories:
- n<1K
---
werdna696 |
AlignmentLab-AI/caption_creation_0.65 | 2023-10-06T02:43:50.000Z | [
"region:us"
] | AlignmentLab-AI | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-e6a33bb8-2600-41a4-8760-acabdbd1953d | 2023-10-03T10:01:52.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
Nuser84/Jeverlyn_Lora | 2023-10-03T10:10:12.000Z | [
"region:us"
] | Nuser84 | null | null | null | 0 | 0 | Entry not found |
zivicmilos/llm-performance | 2023-10-03T11:26:28.000Z | [
"region:us"
] | zivicmilos | null | null | null | 0 | 0 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
# Dataset Card for LLM Performance
### Dataset Summary
This table presents a comprehensive comparative analysis of a few popular LLMs, such as Falcon, Llama 2, and Mistral, highlighting both the quality of their outputs and the corresponding inference times. We fine-tuned the Falcon model on the full Alpaca dataset of 52k datapoints and on a randomly sampled subset of 5k datapoints, then compared the results with the base and instruct versions of Falcon, Llama 2, and Mistral. All models have 7B parameters and use int4 representation.
|
open-llm-leaderboard/details_PY007__TinyLlama-1.1B-Chat-v0.1 | 2023-10-03T10:22:46.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PY007/TinyLlama-1.1B-Chat-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PY007/TinyLlama-1.1B-Chat-v0.1](https://huggingface.co/PY007/TinyLlama-1.1B-Chat-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PY007__TinyLlama-1.1B-Chat-v0.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T10:21:28.182244](https://huggingface.co/datasets/open-llm-leaderboard/details_PY007__TinyLlama-1.1B-Chat-v0.1/blob/main/results_2023-10-03T10-21-28.182244.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2697887991857443,\n\
\ \"acc_stderr\": 0.03204354169460123,\n \"acc_norm\": 0.2726433802005869,\n\
\ \"acc_norm_stderr\": 0.03205337292022538,\n \"mc1\": 0.24479804161566707,\n\
\ \"mc1_stderr\": 0.01505186948671501,\n \"mc2\": 0.3903252380640832,\n\
\ \"mc2_stderr\": 0.014859134378682406\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2790102389078498,\n \"acc_stderr\": 0.013106784883601329,\n\
\ \"acc_norm\": 0.3199658703071672,\n \"acc_norm_stderr\": 0.013631345807016193\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4146584345747859,\n\
\ \"acc_stderr\": 0.004916561213591292,\n \"acc_norm\": 0.5421230830511851,\n\
\ \"acc_norm_stderr\": 0.00497204260200138\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3026315789473684,\n \"acc_stderr\": 0.03738520676119668,\n\
\ \"acc_norm\": 0.3026315789473684,\n \"acc_norm_stderr\": 0.03738520676119668\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2943396226415094,\n \"acc_stderr\": 0.028049186315695248,\n\
\ \"acc_norm\": 0.2943396226415094,\n \"acc_norm_stderr\": 0.028049186315695248\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3179190751445087,\n\
\ \"acc_stderr\": 0.03550683989165581,\n \"acc_norm\": 0.3179190751445087,\n\
\ \"acc_norm_stderr\": 0.03550683989165581\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237656,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237656\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2936170212765957,\n \"acc_stderr\": 0.02977164271249123,\n\
\ \"acc_norm\": 0.2936170212765957,\n \"acc_norm_stderr\": 0.02977164271249123\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748142,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748142\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.35172413793103446,\n \"acc_stderr\": 0.0397923663749741,\n\
\ \"acc_norm\": 0.35172413793103446,\n \"acc_norm_stderr\": 0.0397923663749741\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948365,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948365\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2838709677419355,\n\
\ \"acc_stderr\": 0.025649381063029265,\n \"acc_norm\": 0.2838709677419355,\n\
\ \"acc_norm_stderr\": 0.025649381063029265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.02989611429173355,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.02989611429173355\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055952,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055952\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23232323232323232,\n \"acc_stderr\": 0.030088629490217483,\n \"\
acc_norm\": 0.23232323232323232,\n \"acc_norm_stderr\": 0.030088629490217483\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.35233160621761656,\n \"acc_stderr\": 0.034474782864143586,\n\
\ \"acc_norm\": 0.35233160621761656,\n \"acc_norm_stderr\": 0.034474782864143586\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.27692307692307694,\n \"acc_stderr\": 0.022688042352424994,\n\
\ \"acc_norm\": 0.27692307692307694,\n \"acc_norm_stderr\": 0.022688042352424994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.0279404571362284,\n \"acc_norm\":\
\ 0.3,\n \"acc_norm_stderr\": 0.0279404571362284\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.2773109243697479,\n \"acc_stderr\": 0.029079374539480007,\n\
\ \"acc_norm\": 0.2773109243697479,\n \"acc_norm_stderr\": 0.029079374539480007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3357798165137615,\n \"acc_stderr\": 0.020248081396752934,\n \"\
acc_norm\": 0.3357798165137615,\n \"acc_norm_stderr\": 0.020248081396752934\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.029886910547626964,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.029886910547626964\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.22784810126582278,\n \"acc_stderr\": 0.02730348459906943,\n \
\ \"acc_norm\": 0.22784810126582278,\n \"acc_norm_stderr\": 0.02730348459906943\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.20179372197309417,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.20179372197309417,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596918,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596918\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.14049586776859505,\n \"acc_stderr\": 0.03172233426002159,\n \"\
acc_norm\": 0.14049586776859505,\n \"acc_norm_stderr\": 0.03172233426002159\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.04236511258094631,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.04236511258094631\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n\
\ \"acc_stderr\": 0.03485946096475743,\n \"acc_norm\": 0.16071428571428573,\n\
\ \"acc_norm_stderr\": 0.03485946096475743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.33980582524271846,\n \"acc_stderr\": 0.04689765937278134,\n\
\ \"acc_norm\": 0.33980582524271846,\n \"acc_norm_stderr\": 0.04689765937278134\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n\
\ \"acc_stderr\": 0.02987257770889115,\n \"acc_norm\": 0.2948717948717949,\n\
\ \"acc_norm_stderr\": 0.02987257770889115\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2541507024265645,\n\
\ \"acc_stderr\": 0.01556925469204578,\n \"acc_norm\": 0.2541507024265645,\n\
\ \"acc_norm_stderr\": 0.01556925469204578\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2630057803468208,\n \"acc_stderr\": 0.023703099525258165,\n\
\ \"acc_norm\": 0.2630057803468208,\n \"acc_norm_stderr\": 0.023703099525258165\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808862,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808862\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27124183006535946,\n \"acc_stderr\": 0.025457756696667874,\n\
\ \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.025457756696667874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.22186495176848875,\n\
\ \"acc_stderr\": 0.023598858292863047,\n \"acc_norm\": 0.22186495176848875,\n\
\ \"acc_norm_stderr\": 0.023598858292863047\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600713002,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600713002\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2801418439716312,\n \"acc_stderr\": 0.026789172351140242,\n \
\ \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.026789172351140242\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n\
\ \"acc_stderr\": 0.010966507972178479,\n \"acc_norm\": 0.2438070404172099,\n\
\ \"acc_norm_stderr\": 0.010966507972178479\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3713235294117647,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.3713235294117647,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2647058823529412,\n \"acc_stderr\": 0.01784808957491322,\n \
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.01784808957491322\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.2727272727272727,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.39591836734693875,\n \"acc_stderr\": 0.03130802899065686,\n\
\ \"acc_norm\": 0.39591836734693875,\n \"acc_norm_stderr\": 0.03130802899065686\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2835820895522388,\n\
\ \"acc_stderr\": 0.03187187537919798,\n \"acc_norm\": 0.2835820895522388,\n\
\ \"acc_norm_stderr\": 0.03187187537919798\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n\
\ \"acc_stderr\": 0.031755547866299194,\n \"acc_norm\": 0.21084337349397592,\n\
\ \"acc_norm_stderr\": 0.031755547866299194\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21637426900584794,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.21637426900584794,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24479804161566707,\n\
\ \"mc1_stderr\": 0.01505186948671501,\n \"mc2\": 0.3903252380640832,\n\
\ \"mc2_stderr\": 0.014859134378682406\n }\n}\n```"
repo_url: https://huggingface.co/PY007/TinyLlama-1.1B-Chat-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|arc:challenge|25_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hellaswag|10_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T10-21-28.182244.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T10-21-28.182244.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T10-21-28.182244.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T10-21-28.182244.parquet'
- config_name: results
data_files:
- split: 2023_10_03T10_21_28.182244
path:
- results_2023-10-03T10-21-28.182244.parquet
- split: latest
path:
- results_2023-10-03T10-21-28.182244.parquet
---
# Dataset Card for Evaluation run of PY007/TinyLlama-1.1B-Chat-v0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PY007/TinyLlama-1.1B-Chat-v0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PY007/TinyLlama-1.1B-Chat-v0.1](https://huggingface.co/PY007/TinyLlama-1.1B-Chat-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PY007__TinyLlama-1.1B-Chat-v0.1",
"harness_truthfulqa_mc_0",
split="train")
```
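The timestamped split names in the configurations above follow a simple convention: the run timestamp with `-` and `:` replaced by `_` (the per-split parquet file names instead replace only `:` with `-`). This is an observation from this card's own split names, not a documented API; a minimal sketch:

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to the split name used in this card.

    Observed convention (not a documented API): '-' and ':' become '_',
    and the fractional seconds are kept as-is. Note the parquet file
    names use a different variant, replacing only ':' with '-'.
    """
    return ts.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-10-03T10:21:28.182244"))
# 2023_10_03T10_21_28.182244
```

A helper like this lets you address one specific run, e.g. `split=run_timestamp_to_split(ts)`, instead of the moving `latest` split.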
## Latest results
These are the [latest results from run 2023-10-03T10:21:28.182244](https://huggingface.co/datasets/open-llm-leaderboard/details_PY007__TinyLlama-1.1B-Chat-v0.1/blob/main/results_2023-10-03T10-21-28.182244.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2697887991857443,
"acc_stderr": 0.03204354169460123,
"acc_norm": 0.2726433802005869,
"acc_norm_stderr": 0.03205337292022538,
"mc1": 0.24479804161566707,
"mc1_stderr": 0.01505186948671501,
"mc2": 0.3903252380640832,
"mc2_stderr": 0.014859134378682406
},
"harness|arc:challenge|25": {
"acc": 0.2790102389078498,
"acc_stderr": 0.013106784883601329,
"acc_norm": 0.3199658703071672,
"acc_norm_stderr": 0.013631345807016193
},
"harness|hellaswag|10": {
"acc": 0.4146584345747859,
"acc_stderr": 0.004916561213591292,
"acc_norm": 0.5421230830511851,
"acc_norm_stderr": 0.00497204260200138
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3026315789473684,
"acc_stderr": 0.03738520676119668,
"acc_norm": 0.3026315789473684,
"acc_norm_stderr": 0.03738520676119668
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2943396226415094,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.2943396226415094,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3179190751445087,
"acc_stderr": 0.03550683989165581,
"acc_norm": 0.3179190751445087,
"acc_norm_stderr": 0.03550683989165581
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237656,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237656
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2936170212765957,
"acc_stderr": 0.02977164271249123,
"acc_norm": 0.2936170212765957,
"acc_norm_stderr": 0.02977164271249123
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748142,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748142
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.35172413793103446,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.35172413793103446,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948365,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948365
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2838709677419355,
"acc_stderr": 0.025649381063029265,
"acc_norm": 0.2838709677419355,
"acc_norm_stderr": 0.025649381063029265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.02989611429173355,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.02989611429173355
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055952,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055952
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23232323232323232,
"acc_stderr": 0.030088629490217483,
"acc_norm": 0.23232323232323232,
"acc_norm_stderr": 0.030088629490217483
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35233160621761656,
"acc_stderr": 0.034474782864143586,
"acc_norm": 0.35233160621761656,
"acc_norm_stderr": 0.034474782864143586
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.27692307692307694,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.27692307692307694,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.0279404571362284,
"acc_norm": 0.3,
"acc_norm_stderr": 0.0279404571362284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2773109243697479,
"acc_stderr": 0.029079374539480007,
"acc_norm": 0.2773109243697479,
"acc_norm_stderr": 0.029079374539480007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3357798165137615,
"acc_stderr": 0.020248081396752934,
"acc_norm": 0.3357798165137615,
"acc_norm_stderr": 0.020248081396752934
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.029886910547626964,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.029886910547626964
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22784810126582278,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.22784810126582278,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.20179372197309417,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.20179372197309417,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596918,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596918
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.14049586776859505,
"acc_stderr": 0.03172233426002159,
"acc_norm": 0.14049586776859505,
"acc_norm_stderr": 0.03172233426002159
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094631,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094631
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16071428571428573,
"acc_stderr": 0.03485946096475743,
"acc_norm": 0.16071428571428573,
"acc_norm_stderr": 0.03485946096475743
},
"harness|hendrycksTest-management|5": {
"acc": 0.33980582524271846,
"acc_stderr": 0.04689765937278134,
"acc_norm": 0.33980582524271846,
"acc_norm_stderr": 0.04689765937278134
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.02987257770889115,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.02987257770889115
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2541507024265645,
"acc_stderr": 0.01556925469204578,
"acc_norm": 0.2541507024265645,
"acc_norm_stderr": 0.01556925469204578
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2630057803468208,
"acc_stderr": 0.023703099525258165,
"acc_norm": 0.2630057803468208,
"acc_norm_stderr": 0.023703099525258165
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808862,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808862
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.025457756696667874,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.025457756696667874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.22186495176848875,
"acc_stderr": 0.023598858292863047,
"acc_norm": 0.22186495176848875,
"acc_norm_stderr": 0.023598858292863047
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.024191808600713002,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.024191808600713002
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.026789172351140242,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.026789172351140242
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.010966507972178479,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.010966507972178479
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3713235294117647,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.3713235294117647,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.01784808957491322,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.01784808957491322
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39591836734693875,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.39591836734693875,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2835820895522388,
"acc_stderr": 0.03187187537919798,
"acc_norm": 0.2835820895522388,
"acc_norm_stderr": 0.03187187537919798
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.031755547866299194,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.031755547866299194
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21637426900584794,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.21637426900584794,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24479804161566707,
"mc1_stderr": 0.01505186948671501,
"mc2": 0.3903252380640832,
"mc2_stderr": 0.014859134378682406
}
}
```
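As an illustration of consuming such a results dictionary (a downstream sketch, not part of the generated card), the snippet below extracts the per-task `acc` values from an inlined excerpt of the JSON above and reports the best- and worst-scoring tasks:

```python
# Illustrative excerpt of the results JSON above; only "acc" is kept here.
results = {
    "harness|arc:challenge|25": {"acc": 0.2790102389078498},
    "harness|hellaswag|10": {"acc": 0.4146584345747859},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-international_law|5": {"acc": 0.14049586776859505},
}

# Keep only entries that carry an "acc" metric; in the full JSON this
# skips "harness|truthfulqa:mc|0", which reports mc1/mc2 instead.
accs = {task: m["acc"] for task, m in results.items() if "acc" in m}

best = max(accs, key=accs.get)
worst = min(accs, key=accs.get)
print(f"best:  {best}")   # best:  harness|hellaswag|10
print(f"worst: {worst}")  # worst: harness|hendrycksTest-international_law|5
```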
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-a123ddc3-16f1-484a-bc83-8ec6975eb538 | 2023-10-03T10:38:39.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-6cdbe82a-27df-4754-8455-ba091fba653a | 2023-10-03T10:43:30.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_TheBloke__Llama-2-7b-Chat-AWQ | 2023-10-03T10:55:42.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/Llama-2-7b-Chat-AWQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Llama-2-7b-Chat-AWQ](https://huggingface.co/TheBloke/Llama-2-7b-Chat-AWQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Llama-2-7b-Chat-AWQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T10:54:21.847398](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-7b-Chat-AWQ/blob/main/results_2023-10-03T10-54-21.847398.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24649856315672244,\n\
\ \"acc_stderr\": 0.03141071505730311,\n \"acc_norm\": 0.2472310481243819,\n\
\ \"acc_norm_stderr\": 0.031423123037027975,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.01500067437357034,\n \"mc2\": 0.499548040569627,\n\
\ \"mc2_stderr\": 0.017139623909179967\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22866894197952217,\n \"acc_stderr\": 0.012272853582540799,\n\
\ \"acc_norm\": 0.2721843003412969,\n \"acc_norm_stderr\": 0.013006600406423707\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2551284604660426,\n\
\ \"acc_stderr\": 0.004350424750646203,\n \"acc_norm\": 0.2548297151961761,\n\
\ \"acc_norm_stderr\": 0.004348748730529938\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.03749850709174022,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.03749850709174022\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.035834961763610645,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.035834961763610645\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.20754716981132076,\n \"acc_stderr\": 0.02495991802891127,\n\
\ \"acc_norm\": 0.20754716981132076,\n \"acc_norm_stderr\": 0.02495991802891127\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.03345036916788992,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.03345036916788992\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.02865917937429232,\n\
\ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.02865917937429232\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748142,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748142\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23015873015873015,\n \"acc_stderr\": 0.021679219663693135,\n \"\
acc_norm\": 0.23015873015873015,\n \"acc_norm_stderr\": 0.021679219663693135\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.03512207412302054,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.03512207412302054\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22258064516129034,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.22258064516129034,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.02850137816789395,\n\
\ \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.02850137816789395\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.30303030303030304,\n \"acc_stderr\": 0.03588624800091709,\n\
\ \"acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.03588624800091709\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2474747474747475,\n \"acc_stderr\": 0.03074630074212451,\n \"\
acc_norm\": 0.2474747474747475,\n \"acc_norm_stderr\": 0.03074630074212451\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.18134715025906736,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.18134715025906736,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.02093244577446318,\n\
\ \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.02093244577446318\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.02626502460827589,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.02626502460827589\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22752293577981653,\n \"acc_stderr\": 0.017974463578776502,\n \"\
acc_norm\": 0.22752293577981653,\n \"acc_norm_stderr\": 0.017974463578776502\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.029157522184605607,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.029157522184605607\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.20098039215686275,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.20098039215686275,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.24050632911392406,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.24050632911392406,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.28699551569506726,\n\
\ \"acc_stderr\": 0.030360379710291954,\n \"acc_norm\": 0.28699551569506726,\n\
\ \"acc_norm_stderr\": 0.030360379710291954\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2692307692307692,\n\
\ \"acc_stderr\": 0.029058588303748842,\n \"acc_norm\": 0.2692307692307692,\n\
\ \"acc_norm_stderr\": 0.029058588303748842\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26181353767560667,\n\
\ \"acc_stderr\": 0.01572083867844526,\n \"acc_norm\": 0.26181353767560667,\n\
\ \"acc_norm_stderr\": 0.01572083867844526\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2947976878612717,\n \"acc_stderr\": 0.02454761779480383,\n\
\ \"acc_norm\": 0.2947976878612717,\n \"acc_norm_stderr\": 0.02454761779480383\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2547486033519553,\n\
\ \"acc_stderr\": 0.014572650383409158,\n \"acc_norm\": 0.2547486033519553,\n\
\ \"acc_norm_stderr\": 0.014572650383409158\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \"\
acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2627118644067797,\n\
\ \"acc_stderr\": 0.011240545514995674,\n \"acc_norm\": 0.2627118644067797,\n\
\ \"acc_norm_stderr\": 0.011240545514995674\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20955882352941177,\n \"acc_stderr\": 0.024723110407677048,\n\
\ \"acc_norm\": 0.20955882352941177,\n \"acc_norm_stderr\": 0.024723110407677048\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.017667841612379,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.017667841612379\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.23673469387755103,\n \"acc_stderr\": 0.027212835884073153,\n\
\ \"acc_norm\": 0.23673469387755103,\n \"acc_norm_stderr\": 0.027212835884073153\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2885572139303483,\n\
\ \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.2885572139303483,\n\
\ \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n\
\ \"acc_stderr\": 0.0317555478662992,\n \"acc_norm\": 0.21084337349397592,\n\
\ \"acc_norm_stderr\": 0.0317555478662992\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.034678266857038266,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.034678266857038266\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.01500067437357034,\n \"mc2\": 0.499548040569627,\n\
\ \"mc2_stderr\": 0.017139623909179967\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Llama-2-7b-Chat-AWQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|arc:challenge|25_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hellaswag|10_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T10-54-21.847398.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T10-54-21.847398.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T10-54-21.847398.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T10-54-21.847398.parquet'
- config_name: results
data_files:
- split: 2023_10_03T10_54_21.847398
path:
- results_2023-10-03T10-54-21.847398.parquet
- split: latest
path:
- results_2023-10-03T10-54-21.847398.parquet
---
# Dataset Card for Evaluation run of TheBloke/Llama-2-7b-Chat-AWQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Llama-2-7b-Chat-AWQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Llama-2-7b-Chat-AWQ](https://huggingface.co/TheBloke/Llama-2-7b-Chat-AWQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-7b-Chat-AWQ",
"harness_truthfulqa_mc_0",
split="train")
```
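The timestamped split names that appear throughout the configurations above map directly onto the run timestamp. As a rough sketch (the `timestamp_to_split` helper below is illustrative, not part of the `datasets` library):

```python
# Sketch: the timestamped split names used in this card (e.g.
# "2023_10_03T10_54_21.847398") are derived from the run timestamp by
# replacing the "-" and ":" separators with underscores.
# `timestamp_to_split` is an illustrative helper, not a datasets API.
def timestamp_to_split(timestamp: str) -> str:
    date, time = timestamp.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(timestamp_to_split("2023-10-03T10:54:21.847398"))
# -> 2023_10_03T10_54_21.847398
```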
## Latest results
These are the [latest results from run 2023-10-03T10:54:21.847398](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-7b-Chat-AWQ/blob/main/results_2023-10-03T10-54-21.847398.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24649856315672244,
"acc_stderr": 0.03141071505730311,
"acc_norm": 0.2472310481243819,
"acc_norm_stderr": 0.031423123037027975,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.01500067437357034,
"mc2": 0.499548040569627,
"mc2_stderr": 0.017139623909179967
},
"harness|arc:challenge|25": {
"acc": 0.22866894197952217,
"acc_stderr": 0.012272853582540799,
"acc_norm": 0.2721843003412969,
"acc_norm_stderr": 0.013006600406423707
},
"harness|hellaswag|10": {
"acc": 0.2551284604660426,
"acc_stderr": 0.004350424750646203,
"acc_norm": 0.2548297151961761,
"acc_norm_stderr": 0.004348748730529938
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174022,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174022
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.035834961763610645,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.035834961763610645
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.20754716981132076,
"acc_stderr": 0.02495991802891127,
"acc_norm": 0.20754716981132076,
"acc_norm_stderr": 0.02495991802891127
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.03345036916788992,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.03345036916788992
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.02865917937429232,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.02865917937429232
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748142,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748142
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.021679219663693135,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.021679219663693135
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302054,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302054
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22258064516129034,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.22258064516129034,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.02850137816789395,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.02850137816789395
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.03588624800091709,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.03588624800091709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2474747474747475,
"acc_stderr": 0.03074630074212451,
"acc_norm": 0.2474747474747475,
"acc_norm_stderr": 0.03074630074212451
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.18134715025906736,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.18134715025906736,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.02093244577446318,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.02093244577446318
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.02626502460827589,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.02626502460827589
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22752293577981653,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.22752293577981653,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.029157522184605607,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.029157522184605607
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.20098039215686275,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.20098039215686275,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24050632911392406,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.24050632911392406,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.28699551569506726,
"acc_stderr": 0.030360379710291954,
"acc_norm": 0.28699551569506726,
"acc_norm_stderr": 0.030360379710291954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.029058588303748842,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.029058588303748842
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26181353767560667,
"acc_stderr": 0.01572083867844526,
"acc_norm": 0.26181353767560667,
"acc_norm_stderr": 0.01572083867844526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2947976878612717,
"acc_stderr": 0.02454761779480383,
"acc_norm": 0.2947976878612717,
"acc_norm_stderr": 0.02454761779480383
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2547486033519553,
"acc_stderr": 0.014572650383409158,
"acc_norm": 0.2547486033519553,
"acc_norm_stderr": 0.014572650383409158
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.25,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2627118644067797,
"acc_stderr": 0.011240545514995674,
"acc_norm": 0.2627118644067797,
"acc_norm_stderr": 0.011240545514995674
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20955882352941177,
"acc_stderr": 0.024723110407677048,
"acc_norm": 0.20955882352941177,
"acc_norm_stderr": 0.024723110407677048
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.017667841612379,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.017667841612379
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23673469387755103,
"acc_stderr": 0.027212835884073153,
"acc_norm": 0.23673469387755103,
"acc_norm_stderr": 0.027212835884073153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2885572139303483,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.2885572139303483,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.0317555478662992,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.0317555478662992
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.01500067437357034,
"mc2": 0.499548040569627,
"mc2_stderr": 0.017139623909179967
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-61395877-3217-414c-afdf-bfd5cedbb8fa | 2023-10-03T10:57:08.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Base-WVG-Uncensored | 2023-10-03T11:00:03.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of LTC-AI-Labs/L2-7b-Base-WVG-Uncensored
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LTC-AI-Labs/L2-7b-Base-WVG-Uncensored](https://huggingface.co/LTC-AI-Labs/L2-7b-Base-WVG-Uncensored)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Base-WVG-Uncensored\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T10:58:44.594405](https://huggingface.co/datasets/open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Base-WVG-Uncensored/blob/main/results_2023-10-03T10-58-44.594405.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46911107609262404,\n\
\ \"acc_stderr\": 0.03529369337772234,\n \"acc_norm\": 0.47308157821014335,\n\
\ \"acc_norm_stderr\": 0.03527884705608625,\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839673,\n \"mc2\": 0.42592502213417693,\n\
\ \"mc2_stderr\": 0.014412365042501762\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49573378839590443,\n \"acc_stderr\": 0.014610858923956952,\n\
\ \"acc_norm\": 0.5324232081911263,\n \"acc_norm_stderr\": 0.014580637569995421\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5937064329814777,\n\
\ \"acc_stderr\": 0.0049013686295334225,\n \"acc_norm\": 0.7912766381198965,\n\
\ \"acc_norm_stderr\": 0.004055657006965432\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.0403356566784832,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.0403356566784832\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4679245283018868,\n \"acc_stderr\": 0.030709486992556545,\n\
\ \"acc_norm\": 0.4679245283018868,\n \"acc_norm_stderr\": 0.030709486992556545\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.032469569197899575,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.032469569197899575\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.023000086859068642,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.023000086859068642\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.46774193548387094,\n\
\ \"acc_stderr\": 0.02838474778881333,\n \"acc_norm\": 0.46774193548387094,\n\
\ \"acc_norm_stderr\": 0.02838474778881333\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868408,\n\
\ \"acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868408\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03825460278380025,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03825460278380025\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5050505050505051,\n\
\ \"acc_stderr\": 0.035621707606254015,\n \"acc_norm\": 0.5050505050505051,\n\
\ \"acc_norm_stderr\": 0.035621707606254015\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.6528497409326425,\n \"acc_stderr\": 0.03435696168361355,\n\
\ \"acc_norm\": 0.6528497409326425,\n \"acc_norm_stderr\": 0.03435696168361355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4307692307692308,\n \"acc_stderr\": 0.025106820660539753,\n\
\ \"acc_norm\": 0.4307692307692308,\n \"acc_norm_stderr\": 0.025106820660539753\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4369747899159664,\n \"acc_stderr\": 0.03221943636566196,\n \
\ \"acc_norm\": 0.4369747899159664,\n \"acc_norm_stderr\": 0.03221943636566196\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6293577981651376,\n \"acc_stderr\": 0.02070745816435298,\n \"\
acc_norm\": 0.6293577981651376,\n \"acc_norm_stderr\": 0.02070745816435298\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25462962962962965,\n \"acc_stderr\": 0.02971127586000536,\n \"\
acc_norm\": 0.25462962962962965,\n \"acc_norm_stderr\": 0.02971127586000536\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.553921568627451,\n \"acc_stderr\": 0.034888454513049734,\n \"\
acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.034888454513049734\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6244725738396625,\n \"acc_stderr\": 0.03152256243091156,\n \
\ \"acc_norm\": 0.6244725738396625,\n \"acc_norm_stderr\": 0.03152256243091156\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n\
\ \"acc_stderr\": 0.033378837362550984,\n \"acc_norm\": 0.5515695067264574,\n\
\ \"acc_norm_stderr\": 0.033378837362550984\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968431,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968431\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5337423312883436,\n \"acc_stderr\": 0.039194155450484096,\n\
\ \"acc_norm\": 0.5337423312883436,\n \"acc_norm_stderr\": 0.039194155450484096\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.688034188034188,\n\
\ \"acc_stderr\": 0.03035152732334493,\n \"acc_norm\": 0.688034188034188,\n\
\ \"acc_norm_stderr\": 0.03035152732334493\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6500638569604087,\n\
\ \"acc_stderr\": 0.017055679797150426,\n \"acc_norm\": 0.6500638569604087,\n\
\ \"acc_norm_stderr\": 0.017055679797150426\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.026897049996382875,\n\
\ \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.026897049996382875\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.028607893699576066,\n\
\ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.028607893699576066\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n\
\ \"acc_stderr\": 0.027690337536485372,\n \"acc_norm\": 0.6109324758842444,\n\
\ \"acc_norm_stderr\": 0.027690337536485372\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5030864197530864,\n \"acc_stderr\": 0.02782021415859437,\n\
\ \"acc_norm\": 0.5030864197530864,\n \"acc_norm_stderr\": 0.02782021415859437\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281274,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281274\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34419817470664926,\n\
\ \"acc_stderr\": 0.012134433741002575,\n \"acc_norm\": 0.34419817470664926,\n\
\ \"acc_norm_stderr\": 0.012134433741002575\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5036764705882353,\n \"acc_stderr\": 0.030372015885428188,\n\
\ \"acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.030372015885428188\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4493464052287582,\n \"acc_stderr\": 0.02012376652802727,\n \
\ \"acc_norm\": 0.4493464052287582,\n \"acc_norm_stderr\": 0.02012376652802727\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\
\ \"acc_stderr\": 0.047381987035454834,\n \"acc_norm\": 0.5727272727272728,\n\
\ \"acc_norm_stderr\": 0.047381987035454834\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.40408163265306124,\n \"acc_stderr\": 0.0314147080258659,\n\
\ \"acc_norm\": 0.40408163265306124,\n \"acc_norm_stderr\": 0.0314147080258659\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n\
\ \"acc_stderr\": 0.034288678487786564,\n \"acc_norm\": 0.6218905472636815,\n\
\ \"acc_norm_stderr\": 0.034288678487786564\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457923,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457923\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839673,\n \"mc2\": 0.42592502213417693,\n\
\ \"mc2_stderr\": 0.014412365042501762\n }\n}\n```"
repo_url: https://huggingface.co/LTC-AI-Labs/L2-7b-Base-WVG-Uncensored
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|arc:challenge|25_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hellaswag|10_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T10-58-44.594405.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T10-58-44.594405.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T10-58-44.594405.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T10-58-44.594405.parquet'
- config_name: results
data_files:
- split: 2023_10_03T10_58_44.594405
path:
- results_2023-10-03T10-58-44.594405.parquet
- split: latest
path:
- results_2023-10-03T10-58-44.594405.parquet
---
# Dataset Card for Evaluation run of LTC-AI-Labs/L2-7b-Base-WVG-Uncensored
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/LTC-AI-Labs/L2-7b-Base-WVG-Uncensored
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [LTC-AI-Labs/L2-7b-Base-WVG-Uncensored](https://huggingface.co/LTC-AI-Labs/L2-7b-Base-WVG-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Base-WVG-Uncensored",
	"harness_truthfulqa_mc_0",
	split="latest")
```
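Since each run is stored under a split named after its timestamp, the most recent run can also be selected programmatically: the `YYYY_MM_DDTHH_MM_SS.ffffff` format sorts lexicographically in chronological order. A minimal sketch (the helper name `latest_run_split` is illustrative, not part of any library):

```python
def latest_run_split(split_names):
    """Return the most recent timestamped split name.

    Split names follow the pattern "2023_10_03T10_58_44.594405", which
    sorts lexicographically in chronological order, so max() suffices.
    The alias split "latest" is excluded from the comparison.
    """
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped)

splits = ["2023_10_03T10_58_44.594405", "latest"]
print(latest_run_split(splits))  # -> 2023_10_03T10_58_44.594405
```

This is equivalent to loading the "latest" split directly, but is useful when comparing several timestamped runs side by side.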
## Latest results
These are the [latest results from run 2023-10-03T10:58:44.594405](https://huggingface.co/datasets/open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Base-WVG-Uncensored/blob/main/results_2023-10-03T10-58-44.594405.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task's results in the "results" config and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.46911107609262404,
"acc_stderr": 0.03529369337772234,
"acc_norm": 0.47308157821014335,
"acc_norm_stderr": 0.03527884705608625,
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839673,
"mc2": 0.42592502213417693,
"mc2_stderr": 0.014412365042501762
},
"harness|arc:challenge|25": {
"acc": 0.49573378839590443,
"acc_stderr": 0.014610858923956952,
"acc_norm": 0.5324232081911263,
"acc_norm_stderr": 0.014580637569995421
},
"harness|hellaswag|10": {
"acc": 0.5937064329814777,
"acc_stderr": 0.0049013686295334225,
"acc_norm": 0.7912766381198965,
"acc_norm_stderr": 0.004055657006965432
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.0403356566784832,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.0403356566784832
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4679245283018868,
"acc_stderr": 0.030709486992556545,
"acc_norm": 0.4679245283018868,
"acc_norm_stderr": 0.030709486992556545
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.023000086859068642,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.023000086859068642
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.46774193548387094,
"acc_stderr": 0.02838474778881333,
"acc_norm": 0.46774193548387094,
"acc_norm_stderr": 0.02838474778881333
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6,
"acc_stderr": 0.03825460278380025,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03825460278380025
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5050505050505051,
"acc_stderr": 0.035621707606254015,
"acc_norm": 0.5050505050505051,
"acc_norm_stderr": 0.035621707606254015
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6528497409326425,
"acc_stderr": 0.03435696168361355,
"acc_norm": 0.6528497409326425,
"acc_norm_stderr": 0.03435696168361355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4307692307692308,
"acc_stderr": 0.025106820660539753,
"acc_norm": 0.4307692307692308,
"acc_norm_stderr": 0.025106820660539753
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4369747899159664,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.4369747899159664,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6293577981651376,
"acc_stderr": 0.02070745816435298,
"acc_norm": 0.6293577981651376,
"acc_norm_stderr": 0.02070745816435298
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25462962962962965,
"acc_stderr": 0.02971127586000536,
"acc_norm": 0.25462962962962965,
"acc_norm_stderr": 0.02971127586000536
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.034888454513049734,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.034888454513049734
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6244725738396625,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.6244725738396625,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5515695067264574,
"acc_stderr": 0.033378837362550984,
"acc_norm": 0.5515695067264574,
"acc_norm_stderr": 0.033378837362550984
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968431,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968431
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5337423312883436,
"acc_stderr": 0.039194155450484096,
"acc_norm": 0.5337423312883436,
"acc_norm_stderr": 0.039194155450484096
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.688034188034188,
"acc_stderr": 0.03035152732334493,
"acc_norm": 0.688034188034188,
"acc_norm_stderr": 0.03035152732334493
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6500638569604087,
"acc_stderr": 0.017055679797150426,
"acc_norm": 0.6500638569604087,
"acc_norm_stderr": 0.017055679797150426
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.026897049996382875,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.026897049996382875
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.028607893699576066,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.028607893699576066
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.027690337536485372,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.027690337536485372
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5030864197530864,
"acc_stderr": 0.02782021415859437,
"acc_norm": 0.5030864197530864,
"acc_norm_stderr": 0.02782021415859437
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.028723863853281274,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.028723863853281274
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34419817470664926,
"acc_stderr": 0.012134433741002575,
"acc_norm": 0.34419817470664926,
"acc_norm_stderr": 0.012134433741002575
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.030372015885428188,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.030372015885428188
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4493464052287582,
"acc_stderr": 0.02012376652802727,
"acc_norm": 0.4493464052287582,
"acc_norm_stderr": 0.02012376652802727
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.047381987035454834,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.047381987035454834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40408163265306124,
"acc_stderr": 0.0314147080258659,
"acc_norm": 0.40408163265306124,
"acc_norm_stderr": 0.0314147080258659
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6218905472636815,
"acc_stderr": 0.034288678487786564,
"acc_norm": 0.6218905472636815,
"acc_norm_stderr": 0.034288678487786564
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457923,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457923
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839673,
"mc2": 0.42592502213417693,
"mc2_stderr": 0.014412365042501762
}
}
```
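The per-subject `hendrycksTest` scores above can be reduced to a single MMLU average by filtering on the task prefix. A minimal sketch in Python; the `results` dict here is just a small illustrative excerpt (three subjects copied from the JSON above), not the full results file:

```python
# Excerpt of the per-task results shown above (three subjects for illustration).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.26},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4444444444444444},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.4342105263157895},
}

# Collect the accuracy of every MMLU (hendrycksTest) entry and average them.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(mmlu) / len(mmlu)
print(round(mean_acc, 4))  # → 0.3796
```

The same filter applied to the full dict above yields the aggregated MMLU score reported on the leaderboard.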
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-7b0769c2-32a2-4f68-9a2e-5758e7f3c99e | 2023-10-03T11:02:21.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_stabilityai__stablelm-3b-4e1t | 2023-10-03T11:08:40.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of stabilityai/stablelm-3b-4e1t
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [stabilityai/stablelm-3b-4e1t](https://huggingface.co/stabilityai/stablelm-3b-4e1t)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_stabilityai__stablelm-3b-4e1t\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\ \nThese are the [latest results from run 2023-10-03T11:07:20.615284](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-3b-4e1t/blob/main/results_2023-10-03T11-07-20.615284.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4534844875596275,\n\
\ \"acc_stderr\": 0.035223600817914945,\n \"acc_norm\": 0.457694087853883,\n\
\ \"acc_norm_stderr\": 0.03521504058842905,\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.37196774260485427,\n\
\ \"mc2_stderr\": 0.013504256751536046\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.41467576791808874,\n \"acc_stderr\": 0.014397070564409172,\n\
\ \"acc_norm\": 0.4658703071672355,\n \"acc_norm_stderr\": 0.014577311315231104\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5622385978888668,\n\
\ \"acc_stderr\": 0.00495097323118874,\n \"acc_norm\": 0.7594104760007967,\n\
\ \"acc_norm_stderr\": 0.004265678940698868\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5018867924528302,\n \"acc_stderr\": 0.03077265364207567,\n\
\ \"acc_norm\": 0.5018867924528302,\n \"acc_norm_stderr\": 0.03077265364207567\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n\
\ \"acc_stderr\": 0.041321250197233685,\n \"acc_norm\": 0.4236111111111111,\n\
\ \"acc_norm_stderr\": 0.041321250197233685\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.42196531791907516,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237656,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237656\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31216931216931215,\n \"acc_stderr\": 0.023865206836972602,\n \"\
acc_norm\": 0.31216931216931215,\n \"acc_norm_stderr\": 0.023865206836972602\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04006168083848878,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04006168083848878\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5096774193548387,\n\
\ \"acc_stderr\": 0.02843867799890955,\n \"acc_norm\": 0.5096774193548387,\n\
\ \"acc_norm_stderr\": 0.02843867799890955\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969566,\n\
\ \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5393939393939394,\n \"acc_stderr\": 0.03892207016552013,\n\
\ \"acc_norm\": 0.5393939393939394,\n \"acc_norm_stderr\": 0.03892207016552013\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5303030303030303,\n \"acc_stderr\": 0.03555804051763929,\n \"\
acc_norm\": 0.5303030303030303,\n \"acc_norm_stderr\": 0.03555804051763929\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6217616580310881,\n \"acc_stderr\": 0.03499807276193338,\n\
\ \"acc_norm\": 0.6217616580310881,\n \"acc_norm_stderr\": 0.03499807276193338\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4230769230769231,\n \"acc_stderr\": 0.02504919787604234,\n \
\ \"acc_norm\": 0.4230769230769231,\n \"acc_norm_stderr\": 0.02504919787604234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3949579831932773,\n \"acc_stderr\": 0.03175367846096624,\n \
\ \"acc_norm\": 0.3949579831932773,\n \"acc_norm_stderr\": 0.03175367846096624\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763744,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763744\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6238532110091743,\n \"acc_stderr\": 0.02076923196820508,\n \"\
acc_norm\": 0.6238532110091743,\n \"acc_norm_stderr\": 0.02076923196820508\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647207,\n \"\
acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647207\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5147058823529411,\n \"acc_stderr\": 0.035077938347913236,\n \"\
acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.035077938347913236\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5822784810126582,\n \"acc_stderr\": 0.032103530322412685,\n \
\ \"acc_norm\": 0.5822784810126582,\n \"acc_norm_stderr\": 0.032103530322412685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.48878923766816146,\n\
\ \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.48878923766816146,\n\
\ \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553894,\n\
\ \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553894\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5454545454545454,\n \"acc_stderr\": 0.045454545454545484,\n \"\
acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.045454545454545484\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.04750458399041696,\n\
\ \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.04750458399041696\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6581196581196581,\n\
\ \"acc_stderr\": 0.031075028526507738,\n \"acc_norm\": 0.6581196581196581,\n\
\ \"acc_norm_stderr\": 0.031075028526507738\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6181353767560664,\n\
\ \"acc_stderr\": 0.017373732736677583,\n \"acc_norm\": 0.6181353767560664,\n\
\ \"acc_norm_stderr\": 0.017373732736677583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5115606936416185,\n \"acc_stderr\": 0.026911898686377906,\n\
\ \"acc_norm\": 0.5115606936416185,\n \"acc_norm_stderr\": 0.026911898686377906\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808848,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808848\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5261437908496732,\n \"acc_stderr\": 0.028590752958852394,\n\
\ \"acc_norm\": 0.5261437908496732,\n \"acc_norm_stderr\": 0.028590752958852394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5144694533762058,\n\
\ \"acc_stderr\": 0.028386198084177673,\n \"acc_norm\": 0.5144694533762058,\n\
\ \"acc_norm_stderr\": 0.028386198084177673\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.027815973433878014,\n\
\ \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.027815973433878014\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611324,\n \
\ \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611324\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3617992177314211,\n\
\ \"acc_stderr\": 0.012272736233262936,\n \"acc_norm\": 0.3617992177314211,\n\
\ \"acc_norm_stderr\": 0.012272736233262936\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.45955882352941174,\n \"acc_stderr\": 0.03027332507734575,\n\
\ \"acc_norm\": 0.45955882352941174,\n \"acc_norm_stderr\": 0.03027332507734575\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.41013071895424835,\n \"acc_stderr\": 0.0198984127176359,\n \
\ \"acc_norm\": 0.41013071895424835,\n \"acc_norm_stderr\": 0.0198984127176359\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03136250240935893,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03136250240935893\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6491228070175439,\n \"acc_stderr\": 0.03660298834049163,\n\
\ \"acc_norm\": 0.6491228070175439,\n \"acc_norm_stderr\": 0.03660298834049163\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.37196774260485427,\n\
\ \"mc2_stderr\": 0.013504256751536046\n }\n}\n```"
repo_url: https://huggingface.co/stabilityai/stablelm-3b-4e1t
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|arc:challenge|25_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hellaswag|10_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-07-20.615284.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-07-20.615284.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T11-07-20.615284.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T11-07-20.615284.parquet'
- config_name: results
data_files:
- split: 2023_10_03T11_07_20.615284
path:
- results_2023-10-03T11-07-20.615284.parquet
- split: latest
path:
- results_2023-10-03T11-07-20.615284.parquet
---
# Dataset Card for Evaluation run of stabilityai/stablelm-3b-4e1t
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/stabilityai/stablelm-3b-4e1t
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [stabilityai/stablelm-3b-4e1t](https://huggingface.co/stabilityai/stablelm-3b-4e1t) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_stabilityai__stablelm-3b-4e1t",
"harness_truthfulqa_mc_0",
split="train")
```
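Timestamped split names follow a simple convention visible in the configs above: the run timestamp with `-` and `:` replaced by `_`. A small helper (an illustrative sketch, not part of the `datasets` API) can derive the split name for a given run:

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp (e.g. from a results filename) into the
    corresponding dataset split name, per the convention used in this card."""
    return ts.replace("-", "_").replace(":", "_")


print(timestamp_to_split("2023-10-03T11:07:20.615284"))
# → 2023_10_03T11_07_20.615284
```

Passing the resulting name as `split=` to `load_dataset` selects that specific run instead of the rolling `latest` split.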
## Latest results
These are the [latest results from run 2023-10-03T11:07:20.615284](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-3b-4e1t/blob/main/results_2023-10-03T11-07-20.615284.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of its own configuration):
```python
{
"all": {
"acc": 0.4534844875596275,
"acc_stderr": 0.035223600817914945,
"acc_norm": 0.457694087853883,
"acc_norm_stderr": 0.03521504058842905,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.37196774260485427,
"mc2_stderr": 0.013504256751536046
},
"harness|arc:challenge|25": {
"acc": 0.41467576791808874,
"acc_stderr": 0.014397070564409172,
"acc_norm": 0.4658703071672355,
"acc_norm_stderr": 0.014577311315231104
},
"harness|hellaswag|10": {
"acc": 0.5622385978888668,
"acc_stderr": 0.00495097323118874,
"acc_norm": 0.7594104760007967,
"acc_norm_stderr": 0.004265678940698868
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5018867924528302,
"acc_stderr": 0.03077265364207567,
"acc_norm": 0.5018867924528302,
"acc_norm_stderr": 0.03077265364207567
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4236111111111111,
"acc_stderr": 0.041321250197233685,
"acc_norm": 0.4236111111111111,
"acc_norm_stderr": 0.041321250197233685
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237656,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237656
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31216931216931215,
"acc_stderr": 0.023865206836972602,
"acc_norm": 0.31216931216931215,
"acc_norm_stderr": 0.023865206836972602
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848878,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848878
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5096774193548387,
"acc_stderr": 0.02843867799890955,
"acc_norm": 0.5096774193548387,
"acc_norm_stderr": 0.02843867799890955
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5393939393939394,
"acc_stderr": 0.03892207016552013,
"acc_norm": 0.5393939393939394,
"acc_norm_stderr": 0.03892207016552013
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5303030303030303,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.5303030303030303,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6217616580310881,
"acc_stderr": 0.03499807276193338,
"acc_norm": 0.6217616580310881,
"acc_norm_stderr": 0.03499807276193338
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.02504919787604234,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.02504919787604234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3949579831932773,
"acc_stderr": 0.03175367846096624,
"acc_norm": 0.3949579831932773,
"acc_norm_stderr": 0.03175367846096624
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763744,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763744
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6238532110091743,
"acc_stderr": 0.02076923196820508,
"acc_norm": 0.6238532110091743,
"acc_norm_stderr": 0.02076923196820508
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.36574074074074076,
"acc_stderr": 0.03284738857647207,
"acc_norm": 0.36574074074074076,
"acc_norm_stderr": 0.03284738857647207
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.035077938347913236,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.035077938347913236
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5822784810126582,
"acc_stderr": 0.032103530322412685,
"acc_norm": 0.5822784810126582,
"acc_norm_stderr": 0.032103530322412685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.48878923766816146,
"acc_stderr": 0.033549366530984746,
"acc_norm": 0.48878923766816146,
"acc_norm_stderr": 0.033549366530984746
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5267175572519084,
"acc_stderr": 0.04379024936553894,
"acc_norm": 0.5267175572519084,
"acc_norm_stderr": 0.04379024936553894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.045454545454545484,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.045454545454545484
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5030674846625767,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.5030674846625767,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.04750458399041696,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.04750458399041696
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6581196581196581,
"acc_stderr": 0.031075028526507738,
"acc_norm": 0.6581196581196581,
"acc_norm_stderr": 0.031075028526507738
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6181353767560664,
"acc_stderr": 0.017373732736677583,
"acc_norm": 0.6181353767560664,
"acc_norm_stderr": 0.017373732736677583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5115606936416185,
"acc_stderr": 0.026911898686377906,
"acc_norm": 0.5115606936416185,
"acc_norm_stderr": 0.026911898686377906
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808848,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808848
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5261437908496732,
"acc_stderr": 0.028590752958852394,
"acc_norm": 0.5261437908496732,
"acc_norm_stderr": 0.028590752958852394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5144694533762058,
"acc_stderr": 0.028386198084177673,
"acc_norm": 0.5144694533762058,
"acc_norm_stderr": 0.028386198084177673
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.027815973433878014,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.027815973433878014
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611324,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611324
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3617992177314211,
"acc_stderr": 0.012272736233262936,
"acc_norm": 0.3617992177314211,
"acc_norm_stderr": 0.012272736233262936
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45955882352941174,
"acc_stderr": 0.03027332507734575,
"acc_norm": 0.45955882352941174,
"acc_norm_stderr": 0.03027332507734575
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.41013071895424835,
"acc_stderr": 0.0198984127176359,
"acc_norm": 0.41013071895424835,
"acc_norm_stderr": 0.0198984127176359
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.03136250240935893,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03136250240935893
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6491228070175439,
"acc_stderr": 0.03660298834049163,
"acc_norm": 0.6491228070175439,
"acc_norm_stderr": 0.03660298834049163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.37196774260485427,
"mc2_stderr": 0.013504256751536046
}
}
```
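Every per-task record above shares the same shape (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`), so summary statistics are straightforward to compute once the JSON is parsed. As a minimal sketch — the `raw` string below is a hypothetical two-task excerpt mirroring the structure above, not the full results file — a macro-average accuracy can be derived like this:

```python
import json

# Hypothetical excerpt copying the structure of the results above.
raw = """
{
  "harness|hendrycksTest-management|5": {"acc": 0.6407766990291263, "acc_norm": 0.6407766990291263},
  "harness|hendrycksTest-marketing|5": {"acc": 0.6581196581196581, "acc_norm": 0.6581196581196581}
}
"""

results = json.loads(raw)

# Macro-average: unweighted mean of per-task accuracies.
accs = [task["acc"] for task in results.values()]
macro_avg = sum(accs) / len(accs)
print(round(macro_avg, 4))  # 0.6494
```

The same pattern applies to `acc_norm`, or to the `mc1`/`mc2` fields of the TruthfulQA record, which use different key names.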
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-052e2b36-b4a1-40ab-a439-913db92ce77a | 2023-10-03T11:12:47.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
hanyansen/vldet_dataset | 2023-10-03T12:30:56.000Z | [
"region:us"
] | hanyansen | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-e6326aa4-87a6-479c-8e10-fe618b208ff7 | 2023-10-03T11:21:52.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |