id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
Yvaine0508/test | 2023-09-11T00:53:38.000Z | [
"region:us"
] | Yvaine0508 | null | null | null | 0 | 0 | Entry not found |
presencesw/c4_t5_gq5 | 2023-09-11T01:51:47.000Z | [
"region:us"
] | presencesw | null | null | null | 0 | 0 | Entry not found |
jsrdhher/imagetest | 2023-09-12T01:53:48.000Z | [
"region:us"
] | jsrdhher | null | null | null | 0 | 0 | Entry not found |
jan27/guanaco-llama2-1k | 2023-09-11T02:50:24.000Z | [
"region:us"
] | jan27 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
msi-pcr-sst/vg_command_en_es | 2023-09-11T03:10:10.000Z | [
"region:us"
] | msi-pcr-sst | null | null | null | 0 | 0 | Entry not found |
amitness/logits-mt-it-en-512 | 2023-09-11T12:18:19.000Z | [
"region:us"
] | amitness | null | null | null | 0 | 0 | Entry not found |
manycore-research/PlankAssembly | 2023-09-21T07:20:57.000Z | [
"size_categories:10K<n<100K",
"license:cc-by-nc-nd-4.0",
"arxiv:2308.05744",
"region:us"
] | manycore-research | null | null | null | 0 | 0 | ---
license: cc-by-nc-nd-4.0
size_categories:
- 10K<n<100K
---
# PlankAssembly Dataset
If you encounter download issues, you can download the dataset directly [here](https://manycore-research-azure.kujiale.com/manycore-research/PlankAssembly/data.zip).
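For example, a minimal sketch of fetching and unpacking that archive with the Python standard library (the extraction directory name is an assumption, not part of the archive itself):
```python
import io
import urllib.request
import zipfile

# Mirror URL from above; the extraction directory name is an assumption.
URL = ("https://manycore-research-azure.kujiale.com/"
       "manycore-research/PlankAssembly/data.zip")

with urllib.request.urlopen(URL) as resp:
    archive = zipfile.ZipFile(io.BytesIO(resp.read()))
archive.extractall("PlankAssemblyDataset")
```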
## Dataset Description
- **Homepage:** https://manycore-research.github.io/PlankAssembly
- **Repository:** https://github.com/manycore-research/PlankAssembly
- **Paper:** https://arxiv.org/abs/2308.05744
### Dataset Summary
This is the dataset used for training [PlankAssembly](https://manycore-research.github.io/PlankAssembly). It contains 26,707 shape programs derived from parametric CAD models.
## Dataset Structure
The PlankAssembly dataset is a directory with the following structure:

    PlankAssemblyDataset
    ├── model               # shape programs
    |   └── <MODEL_ID>.json
    └── splits              # dataset splits
        ├── train.txt
        ├── valid.txt
        └── test.txt
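Given this layout, a minimal sketch of resolving a split into shape-program paths, assuming each split file lists one model ID per line (this loop is illustrative, not the official loader):
```python
from pathlib import Path

root = Path("PlankAssemblyDataset")

# Assumption: each split file lists one model ID per line.
train_ids = (root / "splits" / "train.txt").read_text().splitlines()
train_files = [root / "model" / f"{mid}.json" for mid in train_ids]
print(f"{len(train_files)} training shape programs")
```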
## PlankAssembly DSL
A cabinet is typically assembled from a list of planks, where each plank is represented as an axis-aligned cuboid. A cuboid has six degrees of freedom, corresponding to its starting and ending coordinates along the three axes:
```
Cuboid (x_min, y_min, z_min, x_max, y_max, z_max).
```
Each coordinate can either take a numerical value or be a pointer to the corresponding coordinate of another cuboid (the one it attaches to), as in the invented example below.
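For intuition, a small invented example; the `("ref", plank, coord)` tuple encoding here is purely illustrative and not the dataset's own format:
```python
# Plank 0: a side panel; all six coordinates are literal (millimeters).
side_panel = (0, 0, 0, 18, 400, 600)

# Plank 1: a shelf whose x_min points at plank 0's x_max (coordinate
# index 3) instead of taking a number. The ("ref", plank, coord)
# encoding is invented here purely for illustration.
shelf = (("ref", 0, 3), 0, 200, 418, 400, 218)
```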
In parametric modeling software, a plank is typically created by first drawing a 2D profile and then applying an extrusion command. We therefore categorize the faces of each plank as a *sideface* or an *endface*, depending on whether or not they lie along the extrusion direction. Then, given a pair of faces from two different planks, we consider an attachment relationship to exist if (i) the two faces are within a distance threshold of 1 mm and (ii) the pair consists of one sideface and one endface. A sketch of this test follows.
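A minimal sketch of the test, under a deliberately simplified face encoding of `(kind, axis, offset)`; overlap of the bounded face rectangles is ignored for brevity, and nothing here mirrors the dataset's internal representation:
```python
def is_attached(face_a, face_b, threshold_mm=1.0):
    """Sketch of the attachment rule above (hypothetical encoding).

    A face is modeled as (kind, axis, offset): kind is 'sideface' or
    'endface', axis is 0/1/2, and offset is the face plane's coordinate
    in millimeters. Overlap of the face rectangles is ignored.
    """
    kind_a, axis_a, offset_a = face_a
    kind_b, axis_b, offset_b = face_b
    # (ii) the pair must consist of one sideface and one endface
    if {kind_a, kind_b} != {"sideface", "endface"}:
        return False
    # (i) the two face planes must be within the 1 mm threshold
    return axis_a == axis_b and abs(offset_a - offset_b) <= threshold_mm
```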
## Shape Program
Each shape program (*model.json*) is a JSON file with the following structure:
```python
{
    # model ID
    "name": str,
    # numerical coordinates of all planks, in millimeters
    "planks": List[List],  # N x 6
    # extrusion direction of each plank
    "normal": List[List],  # N x 3
    # attachment relationships: -1 denotes no attachment;
    # other values index into the flattened plank sequence
    "attach": List[List],  # N x 6
}
```
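To make the pointer encoding concrete, here is a hedged sketch that loads one shape program and resolves its `attach` indices back to (plank, coordinate) pairs; the assumption that the flattened sequence concatenates each plank's six coordinates in order is ours, not stated above:
```python
import json

# Hypothetical model ID; substitute a real one from the split files.
with open("PlankAssemblyDataset/model/example.json") as f:
    program = json.load(f)

planks = program["planks"]  # N x 6 coordinates, in millimeters
attach = program["attach"]  # N x 6 pointers, -1 meaning "no attachment"

for i, pointers in enumerate(attach):
    for j, ptr in enumerate(pointers):
        if ptr == -1:
            continue
        # Assumption: the flattened sequence concatenates each plank's six
        # coordinates in order, so ptr maps to plank ptr // 6, coord ptr % 6.
        tgt_plank, tgt_coord = divmod(ptr, 6)
        print(f"plank {i}, coord {j} -> plank {tgt_plank}, coord {tgt_coord} "
              f"(value {planks[tgt_plank][tgt_coord]})")
```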
## BibTeX
Please cite our paper if you use the PlankAssembly dataset in your work:
```bibtex
@inproceedings{PlankAssembly,
author = {Hu, Wentao and Zheng, Jia and Zhang, Zixin and Yuan, Xiaojun and Yin, Jian and Zhou, Zihan},
title = {PlankAssembly: Robust 3D Reconstruction from Three Orthographic Views with Learnt Shape Programs},
booktitle = {ICCV},
year = {2023}
}
``` |
yjching/tokenized_dialogsum | 2023-09-11T04:55:20.000Z | [
"region:us"
] | yjching | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 76653920
num_examples: 12460
- name: validation
num_bytes: 3076000
num_examples: 500
- name: test
num_bytes: 9228000
num_examples: 1500
download_size: 5347174
dataset_size: 88957920
---
# Dataset Card for "tokenized_dialogsum"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zloading/outdated_dataset | 2023-09-11T05:32:57.000Z | [
"region:us"
] | zloading | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 133852.0
num_examples: 10
download_size: 132701
dataset_size: 133852.0
---
# Dataset Card for "outdated_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marasama/nva-Galatea | 2023-09-11T04:30:21.000Z | [
"region:us"
] | marasama | null | null | null | 0 | 0 | Entry not found |
sn2222/Workflow | 2023-09-11T05:14:09.000Z | [
"region:us"
] | sn2222 | null | null | null | 0 | 0 | Entry not found |
kalhosni/Llama2_Dataset1 | 2023-09-11T06:35:37.000Z | [
"license:apache-2.0",
"region:us"
] | kalhosni | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
bombbomb1980/test_data | 2023-09-11T05:33:15.000Z | [
"region:us"
] | bombbomb1980 | null | null | null | 0 | 0 | Entry not found |
mindchain/demo_001 | 2023-09-11T06:15:45.000Z | [
"region:us"
] | mindchain | null | null | null | 0 | 0 | Entry not found |
mindchain/demo_003 | 2023-09-11T06:22:29.000Z | [
"region:us"
] | mindchain | null | null | null | 0 | 0 | Entry not found |
mindchain/demo_004 | 2023-09-11T06:24:01.000Z | [
"region:us"
] | mindchain | null | null | null | 0 | 0 | Entry not found |
Alex7756/mix-big-0909 | 2023-09-11T06:29:36.000Z | [
"license:other",
"region:us"
] | Alex7756 | null | null | null | 0 | 0 | ---
license: other
---
|
mindchain/demo_005 | 2023-09-11T06:25:40.000Z | [
"region:us"
] | mindchain | null | null | null | 0 | 0 | Entry not found |
Alex7756/gf_0909 | 2023-09-11T06:34:51.000Z | [
"license:other",
"region:us"
] | Alex7756 | null | null | null | 0 | 0 | ---
license: other
---
|
Dumspiro/Data01_public_uni | 2023-09-11T06:40:30.000Z | [
"region:us"
] | Dumspiro | null | null | null | 0 | 0 | A collection of common university information, such as "What is the main admission process of a university?" and common knowledge about how to get into university. |
mindchain/demo_007 | 2023-09-11T06:48:28.000Z | [
"region:us"
] | mindchain | null | null | null | 0 | 0 | Entry not found |
mindchain/demo_008 | 2023-09-11T07:02:25.000Z | [
"region:us"
] | mindchain | null | null | null | 0 | 0 | Entry not found |
MikeXydas/iris | 2023-09-12T09:40:19.000Z | [
"license:mit",
"region:us"
] | MikeXydas | null | null | null | 0 | 0 | ---
license: mit
---
|
bhawesh155/llama-train-demo-dataset | 2023-09-11T07:20:52.000Z | [
"region:us"
] | bhawesh155 | null | null | null | 0 | 0 | Entry not found |
malaysia-ai/1media.my | 2023-09-11T07:45:40.000Z | [
"region:us"
] | malaysia-ai | null | null | null | 0 | 0 | Entry not found |
bongo2112/my-auto11-outputs-v1 | 2023-09-11T07:58:49.000Z | [
"region:us"
] | bongo2112 | null | null | null | 0 | 0 | Entry not found |
malteee/SynTruckFrame | 2023-09-11T08:41:08.000Z | [
"region:us"
] | malteee | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: mask
dtype: image
- name: bbox
sequence: float64
splits:
- name: train
num_bytes: 169576294.0
num_examples: 171
download_size: 99098836
dataset_size: 169576294.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "SynTruckFrame"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
malteee/SynTruckPlatform | 2023-09-11T08:18:14.000Z | [
"region:us"
] | malteee | null | null | null | 0 | 0 | Entry not found |
shubh0493/TaxAI | 2023-09-11T08:37:06.000Z | [
"license:mit",
"region:us"
] | shubh0493 | null | null | null | 0 | 0 | ---
license: mit
---
|
MiLab-HITSZ/SurrogateAssistedFramework | 2023-09-11T09:44:53.000Z | [
"license:mit",
"region:us"
] | MiLab-HITSZ | null | null | null | 0 | 0 | ---
license: mit
---
|
malteee/SynTuckPlatform | 2023-09-11T08:41:21.000Z | [
"region:us"
] | malteee | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: mask
dtype: image
- name: bbox
sequence: float64
splits:
- name: train
num_bytes: 167539793.0
num_examples: 169
download_size: 98942920
dataset_size: 167539793.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "SynTuckPlatform"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
YoussefThabet/GovChat | 2023-09-11T09:14:02.000Z | [
"region:us"
] | YoussefThabet | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 3620222
num_examples: 3980
download_size: 0
dataset_size: 3620222
---
# Dataset Card for "GovChat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chuyin0321/news-stocks | 2023-09-11T08:49:57.000Z | [
"region:us"
] | chuyin0321 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: symbol
dtype: string
- name: body
dtype: string
- name: publisher
dtype: string
- name: publish_time
dtype: timestamp[ns, tz=GMT]
- name: title
dtype: string
- name: url
dtype: string
- name: uuid
dtype: string
splits:
- name: train
num_bytes: 18435776
num_examples: 2978
download_size: 9157427
dataset_size: 18435776
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "news-stocks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chuyin0321/timeseries-1mn-stocks | 2023-09-11T08:52:37.000Z | [
"region:us"
] | chuyin0321 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: symbol
dtype: string
- name: datetime
dtype: timestamp[ns]
- name: open
dtype: float64
- name: high
dtype: float64
- name: low
dtype: float64
- name: close
dtype: float64
- name: volume
dtype: float64
splits:
- name: train
num_bytes: 21219505
num_examples: 378090
download_size: 15092332
dataset_size: 21219505
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "timeseries-1mn-stocks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
YXStableDiffusion/YXSD | 2023-09-11T09:10:55.000Z | [
"region:us"
] | YXStableDiffusion | null | null | null | 0 | 0 | Entry not found |
YXStableDiffusion/YXESR | 2023-09-11T09:12:53.000Z | [
"region:us"
] | YXStableDiffusion | null | null | null | 0 | 0 | Entry not found |
kalhosni/CustomerChurnTelecom | 2023-09-11T09:34:15.000Z | [
"license:apache-2.0",
"region:us"
] | kalhosni | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
philippemo/test-cc | 2023-09-11T09:24:29.000Z | [
"region:us"
] | philippemo | null | null | null | 0 | 0 | Entry not found |
mindchain/tttttwer | 2023-09-11T09:41:42.000Z | [
"region:us"
] | mindchain | null | null | null | 0 | 0 | Entry not found |
NLPLSS/Articles | 2023-09-11T12:24:14.000Z | [
"region:us"
] | NLPLSS | null | null | null | 0 | 0 | Entry not found |
gendisjawi/golang | 2023-09-11T10:06:18.000Z | [
"region:us"
] | gendisjawi | null | null | null | 1 | 0 | Entry not found |
kirillsev1/detect_human | 2023-09-11T10:10:53.000Z | [
"region:us"
] | kirillsev1 | null | null | null | 0 | 0 | Entry not found |
Haagen-Dazs/Objaverse-MIX | 2023-09-12T15:13:12.000Z | [
"license:openrail",
"region:us"
] | Haagen-Dazs | null | null | null | 0 | 0 | ---
license: openrail
---
|
goodfellowliu/StarGAN_CelebA | 2023-09-11T10:25:36.000Z | [
"license:apache-2.0",
"region:us"
] | goodfellowliu | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
massdhohd/CAXAS | 2023-09-11T12:37:38.000Z | [
"region:us"
] | massdhohd | null | null | null | 0 | 0 | Entry not found |
Lhtie/Bio-Domain-Transfer | 2023-09-11T10:51:59.000Z | [
"region:us"
] | Lhtie | null | null | null | 0 | 0 | Entry not found |
davanstrien/SFconvertbot-stats | 2023-10-03T09:15:04.000Z | [
"region:us"
] | davanstrien | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: createdAt
dtype: timestamp[us]
- name: pr_number
dtype: int64
- name: status
dtype: large_string
- name: repo_id
dtype: large_string
- name: type
dtype: large_string
- name: isPullRequest
dtype: bool
splits:
- name: train
num_bytes: 3437090
num_examples: 40258
download_size: 1413838
dataset_size: 3437090
---
# Dataset Card for "SFconvertbot-stats"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AnimaleMaleEnhancementVenezuela/AnimaleMaleEnhancementVenezuela | 2023-09-11T11:00:05.000Z | [
"region:us"
] | AnimaleMaleEnhancementVenezuela | null | null | null | 0 | 0 | ## Official Facebook page:
https://www.facebook.com/AnimaleMaleEnhancementVe/
https://www.facebook.com/AnimaleMaleEnhancementInVE/
https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/
https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/
https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/
https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/
### ✅ Product name: [Animale Male Enhancement Venezuela](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
### ✅ Benefit: [stronger sex drive](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
### ✅ Quantity: [30 pills](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
### ✅ Rating: ★★★★☆ (4.5/5.0)
### ✅ Offer: [unconditional 90-day guarantee](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
[Animale Male Enhancement Venezuela](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt), the most potent male enhancement pills on the market right now.
Men prioritize improving their sex lives as a way to boost their confidence and have more satisfying sexual encounters. For lack of knowledge and experience, however, many men turn to strategies that are ineffective and can even be harmful to their long-term health. That being the case, we would like to tell you about Animale Male Enhancement Venezuela, the most potent male enhancement pills on the market right now. These gummies are made from all-natural ingredients and are safe for anyone interested in improving their sexual performance.
[![Animale-Male-Enhancement-Venezuela-Capsules](https://i.ibb.co/SsB5pxY/Animale-Male-Enhancement-Venezuela-Capsules.png)](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
[MUST SEE: CLICK HERE TO BUY AND GET 70% OFF IN VENEZUELA](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
They also come with a selection of bonuses that, if used, can enhance your masculinity in several ways. So if you have been searching for a natural method to boost your libido and improve your sexual performance, you need look no further.
[CHECK THE DISCOUNTED PRICE AVAILABLE BY TAPPING HERE ON THE OFFICIAL WEBSITE](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
**Product Description**
The [Animale Male Enhancement Venezuela](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt) pills are a popular choice among those who want to boost their sexual performance. The infused gummies are presented as a natural way to improve sexual performance and stamina. They are convenient, since you can take them whenever you like without fear of negative side effects, and you can start to feel the results within minutes of taking them.
**The benefits of using Animale Male Enhancement Venezuela to improve sexual performance**
The life-changing potential of male enhancement tablets is certainly plausible. What is more helpful than gummies for boosting libido? These pills are claimed to be effective and safe to use, as they include a natural aphrodisiac.
They are also vegan and gluten-free, making them a great option for anyone on a special diet. The capsules dissolve quickly in the mouth, so they are easy to use on the go. Any man who wants to reach his full sexual potential should consider adding these gummies to his routine.
**More powerful erections**
There is no doubt that sex is pleasurable, and things can be a lot of fun when they go well. If you are having trouble in the bedroom, however, it could be a sign that you are struggling with something more serious, such as anxiety or depression. If you are looking for a natural way to improve your sexual performance, this may be the answer you were looking for. The claim is that [Male Enhancement Venezuela](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt) increases blood flow to the penis during sex, and that it also helps with mental health problems such as anxiety and depression, which can have a negative impact on sexual performance. The gummies are consumed simply by chewing and swallowing them like candy.
[![Animale-Male-Enhancement-Venezuela](https://i.ibb.co/C6fPph6/Animale-Male-Enhancement-Venezuela.png)](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
[VISIT THE OFFICIAL WEBSITE TO BUY YOUR BOTTLE, GET IT NOW](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
**A greater need for sexual contact**
The urge to engage in sexual activity tends to increase in times of intense stress or worry. The Male Enhancement Venezuela gummies are meant to ease the distress of sexual problems by boosting libido and bedroom performance, reducing stress and anxiety levels, and increasing blood flow, resulting in more powerful erections. The gummies are also good for unwinding after a long day at work; for best results, simply place one in your mouth before bed.
[➢VISIT THE OFFICIAL WEBSITE TO BUY, SPECIAL OFFER TODAY!!](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
**An improved sense of confidence**
These bite-sized treats are made with [Male Enhancement Venezuela](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt) in its purest, most natural form. They are said to boost confidence and sexual efficiency. All it takes to get relief from stress and worry is to pop one in your mouth and let the magic happen; whether you are going through a rough patch or just want a break from your usual worries, you will feel better within minutes.
The Male Enhancement Venezuela gummies are highly recommended in such a situation, as they include natural ingredients that stimulate circulation and stamina, with no claimed danger or unwanted consequences. Two or three gummies taken shortly before bed can reportedly boost your libido and keep you strong all night.
**Male Enhancement Venezuela oil**
If you are looking for a natural, effective way to boost your sexual performance, the gummies are a great alternative to explore. They are claimed to use only the finest ingredients (the oil plus natural flavors), and all you have to do to take one is put it in your mouth.
**L-arginine**
Gummies with L-arginine are presented as an essential item for any male enhancement regimen. L-arginine is a nitrogenous compound that has been shown to increase blood flow and improve libido; nitric oxide, produced when L-arginine is broken down, dilates the blood vessels and increases sexual desire. The additional ingredients in these gummy bears are expected to further improve sexual performance.
**Saw palmetto fruit**
The berry of the saw palmetto tree has a long history of use for boosting natural testosterone production, and it is a key ingredient in these gummies, which are claimed to increase libido and male performance in bed. Saw palmetto berries also contain other key nutrients, such as vitamins B6 and B12, which contribute to maintaining and improving sexual health and performance. The candies also include horny goat weed, a plant said to increase a man's libido and effectiveness in bed.
[![Animale-Male-Enhancement-Venezuela-PIlls](https://i.ibb.co/VvNv7h0/Animale-Male-Enhancement-Venezuela-PIlls.png)](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
**Tribulus terrestris, Animale male enhancement gummies**
The plant Tribulus terrestris has been held for centuries to improve both sexual health and performance, which is why it appears in traditional Chinese medicine. It has also been claimed to increase testosterone levels, which may explain why some men report increased libido and stronger erections after using it. The active components of Tribulus terrestris gummies are said to be most effective when consumed just before bed, and they can even serve as a late-night snack without fear of negative consequences.
**Eurycoma longifolia**
Looking for a more organic approach to improving your sexual performance? Consider the Eurycoma longifolia that forms the base of these candies. These remedies are claimed to increase blood flow to the penis, which in turn improves sexual performance and overall enjoyment. Because they are so easy to consume, you can have one whenever you choose.
**How can Turbo Animale Male Enhancement Venezuela improve your sexual performance?**
One of the claimed advantages of these male enhancement tablets is improved sexual performance. The gummies are popular because they are comfortable to use and have few, if any, negative side effects. As soon as you include them in your regular routine, you will begin to reap the benefits, and when used alongside other male enhancement medications, such as testosterone boosters or erectile dysfunction drugs, they reportedly deliver even better results. So what are you waiting for? Get some gummies now and see if they work for you.
**About the Animale male enhancement gummy bears**
Taking male enhancement gummies is a convenient, easy way to potentially boost sexual performance. They are a great option for those who do not want their medicine to taste unpleasant, since they have no odor or taste. The claim is that the gummies are safe to take because the active compound is a natural chemical found in a variety of cannabis strains, and that it may help people with sexual health problems such as reduced libido or erectile dysfunction.
**Exactly how does Animale male enhancement work?**
There is a growing trend of using gummies for male enhancement. These chewable candies contain a compound, found in cannabis, that has been linked to positive therapeutic benefits in humans. Easy to take, the gummies are said to help men with a variety of male-specific problems, including erectile dysfunction (ED), and can be enjoyed without worrying about negative side effects or interactions with other therapies you may be receiving.
**Conclusion**
Are you looking for a method to improve your sexual performance that is both natural and highly effective? If so, you are in luck! Our expert staff has carried out considerable research and analysis and concluded that Animale Male Enhancement Venezuela are the best gummies on the market right now. These gummies are made to help men of any age improve their sexual performance, and they have many more benefits besides; our simple instructions will also show you how to use Animale Male Enhancement Venezuela to its full potential.
[![Animale-Male-Venezuela](https://i.ibb.co/d6fy5cc/Animale-Male-Venezuela.png)](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
### Our official blogs ⇒
https://animale-male-enhancement-venez-08a14d.webflow.io/
https://animale-male-enhancement-ve-92964b.webflow.io/
https://animalemaleenhancementvenezuela.webflow.io/
https://animalemaleenhancementve.webflow.io/
https://animale-male-enhancement-venezuela.mystrikingly.com/
https://animale-male-enhancement-ve.mystrikingly.com/
https://animalemaleenhancementvenezuela.mystrikingly.com/
http://animalemaleenhancementve.mystrikingly.com/
https://animale-male-enhancement-venezuela-1.jimdosite.com/
https://animale-male-enhancement-ve-1.jimdosite.com/
https://animalemaleenhancementvenezuela.jimdosite.com/
https://animalemaleenhancementve.jimdosite.com/
https://animale-male-enhancement-venez.godaddysites.com/
https://animale-male-enhancement-ve.godaddysites.com/
https://animalemaleenhancementvenezue1.godaddysites.com/
https://animalemaleenhancementve.godaddysites.com/
https://animalemaleenhancement-venezuela.jigsy.com/
https://animale-male-enhancement-ve.jigsy.com/
https://animalemaleenhancementvenezuela.jigsy.com/
https://animalemaleenhancementve.jigsy.com/
https://animale-male-enhancement-venezuela.company.site/
https://animale-male-enhancement-ve.company.site/
https://animalemaleenhancementinvenezuela.company.site/
https://animalemaleenhancementve.company.site/
https://sites.google.com/view/animale-male-enhancement-ve/
https://sites.google.com/view/animale-male-enhancement-in-ve/
https://groups.google.com/u/1/g/animale-male-enhancement-venezuela-buy/c/vojIXk_VbA4
https://groups.google.com/u/1/g/animale-male-enhancement-venezuela-buy/c/lX6zqO5b_jM
https://groups.google.com/u/1/g/animale-male-enhancement-venezuela-buy/c/6KTlPYQ-pdI
https://groups.google.com/u/1/g/animale-male-enhancement-venezuela-buy/c/l-N-_2YoGVw
https://lookerstudio.google.com/u/0/reporting/399a76e1-6ad8-466e-82f6-d741df344a3c/page/fqHFD
https://lookerstudio.google.com/u/0/reporting/c0ee559d-067d-40b4-b3cc-e274756b2125/page/ngDFD |
AnimaleMaleEnhancementVenezuela/AnimaleMaleEnhancementVE | 2023-09-11T11:00:40.000Z | [
"region:us"
] | AnimaleMaleEnhancementVenezuela | null | null | null | 0 | 0 | ## Official Facebook page:
https://www.facebook.com/AnimaleMaleEnhancementVe/
https://www.facebook.com/AnimaleMaleEnhancementInVE/
https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/
https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/
https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/
https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/
### ✅ Product name: [Animale Male Enhancement Venezuela](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
### ✅ Benefit: [stronger sex drive](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
### ✅ Quantity: [30 pills](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
### ✅ Rating: ★★★★☆ (4.5/5.0)
### ✅ Offer: [unconditional 90-day guarantee](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
<p><a href="https://www.facebook.com/events/1429727191099071/"><strong>https://www.facebook.com/events/1429727191099071/</strong></a></p>
<p> </p>
<h3><span style="background-color: #ffff00;"><strong>Animale Male Enhancement Malaysia Official Links ⇒</strong></span></h3>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementMalaysia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementMalaysia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInMalaysia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInMalaysia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementMY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementMY/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInMY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInMY/</strong></a></p>
<p> </p>
<h3><span style="background-color: #ffff00;"><strong>Animale Male Enhancement New Zealand Official Links ⇒</strong></span></h3>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementNZ/"><strong>https://www.facebook.com/AnimaleMaleEnhancementNZ/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementOfNZ/"><strong>https://www.facebook.com/AnimaleMaleEnhancementOfNZ/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAtNZ/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAtNZ/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInNewZealand/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInNewZealand/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementOFNewZealand/"><strong>https://www.facebook.com/AnimaleMaleEnhancementOFNewZealand/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAtNewZealand/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAtNewZealand/</strong></a></p>
<p> </p>
<h3><span style="background-color: #ffff00;"><strong>Slim Life Keto Gummies Official Links ⇒</strong></span></h3>
<p><a href="https://www.facebook.com/SlimlifeKetoGummiesUS/"><strong>https://www.facebook.com/SlimlifeKetoGummiesUS/</strong></a></p>
<p><a href="https://www.facebook.com/SlimLifeKetoGummiesInUS/"><strong>https://www.facebook.com/SlimLifeKetoGummiesInUS/</strong></a></p>
<p><a href="https://www.facebook.com/SlimlifeKetoGummiesOfUS/"><strong>https://www.facebook.com/SlimlifeKetoGummiesOfUS/</strong></a></p>
<p><a href="https://www.facebook.com/SlimLifeKetoGummiesAtUS/"><strong>https://www.facebook.com/SlimLifeKetoGummiesAtUS/</strong></a></p>
<p><a href="https://www.facebook.com/SlimLifeEvolutionKetoGummiesOfUS/"><strong>https://www.facebook.com/SlimLifeEvolutionKetoGummiesOfUS/</strong></a></p>
<p><a href="https://www.facebook.com/slimLifeEvolutionKetoGummiesReview/"><strong>https://www.facebook.com/slimLifeEvolutionKetoGummiesReview/</strong></a></p>
<p> </p>
<p><strong><span style="font-size: 22px;">Recent searches:</span></strong></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVe/"><strong>#AnimaleMaleEnhancementVE</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVE/"><strong>#AnimaleMaleEnhancementVEReviews</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/"><strong>#AnimaleMaleEnhancementVEPrice</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/"><strong>#AnimaleMaleEnhancementVEBuy</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/"><strong>#AnimaleMaleEnhancementVEShop</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/"><strong>#AnimaleMaleEnhancementVEOrder</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVe/"><strong>#AnimaleMaleEnhancementVEOfficial</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVE/"><strong>#AnimaleMaleEnhancementVEIngredients</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/"><strong>#AnimaleMaleEnhancementVEPurchase</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/"><strong>#AnimaleMaleEnhancementVEDiscount</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/"><strong>#AnimaleMaleEnhancementVELegit</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/"><strong>#AnimaleMaleEnhancementVEScam</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVe/"><strong>#AnimaleMaleEnhancementVenezuela</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVE/"><strong>#AnimaleMaleEnhancementVenezuelaReviews</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/"><strong>#AnimaleMaleEnhancementVenezuelaPrice</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/"><strong>#AnimaleMaleEnhancementVenezuelaBuy</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/"><strong>#AnimaleMaleEnhancementVenezuelaShop</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/"><strong>#AnimaleMaleEnhancementVenezuelaOrder</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVe/"><strong>#AnimaleMaleEnhancementVenezuelaOfficial</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVE/"><strong>#AnimaleMaleEnhancementVenezuelaIngredients</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/"><strong>#AnimaleMaleEnhancementVenezuelaPurchase</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/"><strong>#AnimaleMaleEnhancementVenezuelaDiscount</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/"><strong>#AnimaleMaleEnhancementVenezuelaLegit</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/"><strong>#AnimaleMaleEnhancementVenezuelaScam</strong></a></p> |
muscle-monster/muscle-monster-jack-hammer-libido-booster | 2023-09-11T11:06:05.000Z | [
"region:us"
] | muscle-monster | null | null | null | 0 | 0 | ➥**Product Name** – [Muscle Monster Jack Hammer Libido Booster](https://muscle-monster-jack-hammer-libido-booster.jimdosite.com/)
➥**Composition** – Natural Component
➥**Category** – Male Enhancement
➥**Rating** – ★★★★✰
➥**Purchase Access** – Only On [Official Website](https://www.healthsupplement24x7.com/get-jack-hammer)
➥**Official Website** – [https://www.healthsupplement24x7.com/get-jack-hammer](https://www.healthsupplement24x7.com/get-jack-hammer)
[Jack Hammer Libido Booster](https://soundcloud.com/jack-hammer-libido-booster-529116114/muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers) formula is a highly effective remedy for men’s issues that has already impacted the lives of thousands of men throughout the world. These capsules combine a variety of the most potent ingredients for increased efficacy. Often, the first sign of these issues in men is simply erectile dysfunction. Fortunately, this product is capable of dealing with such issues efficiently.
[](https://www.healthsupplement24x7.com/get-jack-hammer)
### _**[Visit the Official Website of Jack Hammer Libido Booster!!!](https://www.healthsupplement24x7.com/get-jack-hammer)**_
**What Is Jack Hammer Libido Booster?**
---------------------------------------
[**Jack Hammer Libido Booster**](https://pdfhost.io/v/InNJzmxDm_Muscle_Monster_Jack_Hammer_Is_Libido_Booster_Performance_And_Stamina_Enhancers) is a male enhancement supplement formulated with a blend of clinical-strength ingredients that help boost the three S’s of sex (size, stamina, and satisfaction). Thanks to its dual-action formula, it can also help address the root causes of ED and other sexual dysfunctions to restore your sexual power and performance.
[Jack Hammer Libido Booster](https://groups.google.com/g/get-jack-hammer/c/9pg4sZPuxHw) uses fast-absorption and extended-release technology to raise the supplement’s effectiveness. The rapid absorption of the product’s active ingredients into the user’s bloodstream results in an immediate boost to sex drive and performance, while the extended-release technology ensures the supplement’s results continue over time.
**How does Jack Hammer Libido Booster Work?**
---------------------------------------------
This male enhancement supplement works to maximize the natural sexual response, helping you become sexually active and deliver your best performance during intimate moments without any side effects. It targets the core causes of erectile dysfunction and low libido, since aging is often the primary reason men lose sexual desire and see their virility decline.
This supplement also proves trustworthy at maintaining sufficient testosterone levels in men. To give harder, longer-lasting erections, it releases molecules that stimulate higher blood flow to the penile shaft, and to heighten sexual power, stamina, and endurance during intercourse it elevates HGH (human growth hormone).
[.png)](https://www.healthsupplement24x7.com/get-jack-hammer)
**Main benefits of Jack Hammer Libido Booster**
-----------------------------------------------
[Jack Hammer Libido Booster](https://colab.research.google.com/drive/1uBFdkYyNl9a-LZBfoSjkNc4tR6giacYN) may offer a range of sexual health benefits to men, all aimed at greater sexual power, pleasure, and performance. Here are the main sexual health benefits you can expect from using this [Jack Hammer Libido Booster](https://form.jotform.com/Jack_Hammer_Libido/muscle-monster-jack-hammer) supplement:
**Increased libido and sex drive**: It can help restore your sexual energy and performance and boost your sexual desire and arousal.
**Longer staying power:** The blend of active ingredients can help improve sexual endurance. According to the manufacturer, it can increase blood flow to the penile area, making your sessions last longer than usual.
**Harder erections**: The formula contains several ingredients dedicated to ensuring harder erections. The increased nitric oxide and blood flow to the penile chambers help you enjoy a satisfying sexual session with your partner, and may also help increase the size of your penis in both length and girth.
**Improved confidence**: Because of low sexual performance, many people lose their sexual confidence. If you use this supplement and get the advertised benefits, you can regain it.
**Immediate boost in libido and sex drive**: It helps recharge the sexual energy stored across your body so you can experience a full, blissful, and satisfying sex life again.
**Long-lasting erections**: It helps boost the flow of blood to the penis for better erections. This way, you and your partner can enjoy intense sexual sessions consistently.
**Increase in penis size**: It helps expand the penile chambers to increase their blood-holding capacity. This mechanism, combined with a steady flow of blood to your penis, can help increase its overall proportions (length and girth). This way, you won’t have to worry about your penis size anymore.
**Rise in sexual confidence**: It is formulated with ingredients known to restore youthful power and strength. This gives you a second chance to enjoy a powerful sex life like you did in your twenties. With this male enhancement supplement, you get to boost your sexual confidence and be proud of yourself in the bedroom with your partner.
**Longer staying sessions**: It may help solve the problem of premature ejaculation by flooding the penile chambers with blood. This mechanism lets you last up to five times longer than usual whenever you wish to engage in intercourse with your partner.
[.png)](https://www.healthsupplement24x7.com/get-jack-hammer)
### _**[Click Here to Get Muscle Monster Jack Hammer for an Exclusive Discounted Price](https://www.healthsupplement24x7.com/get-jack-hammer)**_
**What Ingredients Make Muscle Monster Jack Hammer So Powerful?**
-----------------------------------------------------------------
**Magnesium:** One of the ingredients added to the blend is an essential trace mineral that is involved in enhancing several processes in the body. As a trace mineral, this ingredient is **important for improving blood pressure**, reducing the frequency of headaches, and improving your sleep quality.
**Tribulus Terrestris:** The main use of this ingredient is improving blood flow, enhancing the health of your urinary tract, reducing inflammation, and increasing the production of a crucial male hormone. Additionally, some studies suggest that this ingredient is beneficial for eliminating the risk of heart problems, improving your physical performance, and increasing your levels of energy.
**Chrysin:** This is an important ingredient for enhancing male health because of its ability to reduce the production of aromatase, which in turn lowers the levels of estrogen in your body. According to research, this ingredient is also important for reducing cellular damage caused by free radicals, as it contains important antioxidants.
**Epimedium Sagittatum:** The main function of this ingredient is to improve blood flow throughout the body. This is to make sure that any of your organs, at a given time, function effectively and without any difficulty. Finally, it is great for reducing pain and fatigue as well as **maintaining your overall health**.
**Tongkat Ali:** This ingredient contains beneficial compounds that are great for enhancing overall health. The reason it has been added to the formula is that it **_increases energy levels naturally, tones your muscles, and improves your metabolic speed_**. The boost in energy levels also enables you to perform different activities with ease throughout the day.
**Saw Palmetto Berries:** Saw Palmetto Berries are rich in antioxidants, which help to reduce cellular damage and maintain the health of your organs. Additionally, they have proven effects against certain health conditions unique to men and also reduce swelling and pain in different parts of your body. As an anti-inflammatory agent, the ingredient is also great for increasing the levels of your natural male hormone.
**Chinese Hawthorn:** This ingredient is one of the key elements of this formula. The Chinese Hawthorn Berry is **important for improving your blood circulation**. This in turn helps to lower the risk of high blood pressure as well as high cholesterol levels. By increasing your blood flow, your body can perform several functions easily and also **improve various aspects of male health effectively**. Finally, this ingredient is also an anti-inflammatory which improves the health of internal organs and maintains their functions.
**Winged Trebine:** The ingredient is well known for the different antioxidants that it possesses. This **helps to eliminate toxins and free radicals from your body and rejuvenate the health** and well-being of your internal system. Additionally, the ingredient has been found to improve your bone density as well as reduce the risk of fractures and joint pain.
[.png)](https://www.healthsupplement24x7.com/get-jack-hammer)
### _**[HURRY! Stocks on Demand! Choose the Best Deals Before the Offer Ends](https://www.healthsupplement24x7.com/get-jack-hammer)**_
**Exactly How to Use Jack Hammer Libido Booster?**
--------------------------------------------------
It does a lot for virility once the [Jack Hammer Libido Booster](https://www.sympla.com.br/evento/muscle-monster-jack-hammer---is-libido-booster-performance-and-stamina-enhancers/2154870) supplement is added to your daily routine. Each monthly supply of this product contains 60 pills, and the customer needs to take 2 pills per day. Customers are advised to take these pills a few minutes before intercourse, with a glass full of water. Within about 15 minutes of intake, you will begin to feel more grounded and energetic. Accordingly, taking it daily for no more than 2-3 months helps rebuild your sexual stamina and also curbs premature ejaculation.
**Jack Hammer Libido Booster** **Side Effects**
-----------------------------------------------
The male enhancement formula is a dietary supplement based on natural ingredients. According to experts, the [Jack Hammer Libido Booster](https://devfolio.co/projects/muscle-monster-jack-hammer-8787) supplement is completely safe for the body and does not cause any harm. There are no negative side effects.
**What is the cost of Muscle Monster Jack Hammer?**
---------------------------------------------------
Blood circulation is an important aspect of your health, as it delivers blood throughout the body so that the different organs can function effectively.
[Muscle Monster Jack Hammer](https://lexcliq.com/muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers/) is a blood circulation enhancer that has been made available for purchase on its official website.
Different packs are available at different discounts making the product very affordable. Here is a glimpse of the price ranges for Muscle Monster Jack Hammer:
* _**1 BOTTLE $79 + Free Shipping Included!**_
* _**3 BOTTLES SALE PRICE $177 + Free Shipping Included!**_
* _**6 BOTTLES SALE PRICE $294 (Price Per Bottle: $49) + Free Shipping Included!**_
### [.png)](https://www.healthsupplement24x7.com/get-jack-hammer)
### _**[(DO NOT MISS OUT) Order Now and Get Special Discount Savings While Supplies Last!](https://www.healthsupplement24x7.com/get-jack-hammer)**_
**Money Back Guarantee**
------------------------
[Muscle Monster Jack Hammer](https://www.ivoox.com/muscle-monster-jack-hammer-is-libido-booster-audios-mp3_rf_115777891_1.html) comes with a generous 365-day money-back guarantee, reflecting the manufacturer's confidence in the product's effectiveness. This guarantee ensures that customers have ample time to try [Muscle Monster Jack Hammer](https://jackhammer.clubeo.com/page/jack-hammer-is-libido-booster-performance-and-stamina-enhancers.html) and assess its benefits without any risk. If, for any reason, a customer is unsatisfied with their purchase within 365 days, they can request a refund.
The money-back guarantee not only instills trust in the product but also demonstrates the manufacturer's commitment to customer satisfaction. It provides reassurance and peace of mind, allowing users to explore the potential of [Muscle Monster Jack Hammer](https://jackhammer.clubeo.com/page/muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers.html) with confidence, knowing that their investment is protected.
**Where And How To Buy Muscle Monster Jack Hammer?**
----------------------------------------------------
To purchase this product at the best price, visit the official website; you can click any link or the image below on this page to get there. After placing your order, you can expect delivery within 5 to 7 business days.
**Final Recap**
---------------
[Muscle Monster Jack Hammer](https://jackhammer.clubeo.com/calendar/2023/09/11/muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers) may be new, but it is already proving to be one of the most effective natural male enhancement supplements available today. If you want harder, stronger erections, want to last longer in bed, and want the most satisfying sex of your life, then you need to order [Muscle Monster Jack Hammer](https://jackhammer.clubeo.com). To purchase the #1 natural male enhancement product available, you need to _**[Visit the official website of Muscle Monster Jack Hammer and order your bottles today!](https://www.healthsupplement24x7.com/get-jack-hammer)**_
[https://healthsupplements24x7.blogspot.com/2023/09/jack-hammer-libido-booster.html](https://healthsupplements24x7.blogspot.com/2023/09/jack-hammer-libido-booster.html)
[https://pdfhost.io/v/InNJzmxDm\_Muscle\_Monster\_Jack\_Hammer\_Is\_Libido\_Booster\_Performance\_And\_Stamina\_Enhancers](https://pdfhost.io/v/InNJzmxDm_Muscle_Monster_Jack_Hammer_Is_Libido_Booster_Performance_And_Stamina_Enhancers)
[https://jackhammer.clubeo.com](https://jackhammer.clubeo.com)
[https://jackhammer.clubeo.com/calendar/2023/09/11/muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers](https://jackhammer.clubeo.com/calendar/2023/09/11/muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers)
[https://jackhammer.clubeo.com/page/muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers.html](https://jackhammer.clubeo.com/page/muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers.html)
[https://jackhammer.clubeo.com/page/jack-hammer-is-libido-booster-performance-and-stamina-enhancers.html](https://jackhammer.clubeo.com/page/jack-hammer-is-libido-booster-performance-and-stamina-enhancers.html)
[https://muscle-monster-jack-hammer.hashnode.dev/muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers](https://muscle-monster-jack-hammer.hashnode.dev/muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers)
[https://www.ivoox.com/muscle-monster-jack-hammer-is-libido-booster-audios-mp3\_rf\_115777891\_1.html](https://www.ivoox.com/muscle-monster-jack-hammer-is-libido-booster-audios-mp3_rf_115777891_1.html)
[https://soundcloud.com/jack-hammer-libido-booster-529116114/muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers](https://soundcloud.com/jack-hammer-libido-booster-529116114/muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers)
[https://lexcliq.com/muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers/](https://lexcliq.com/muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers/)
[https://www.sympla.com.br/evento/muscle-monster-jack-hammer---is-libido-booster-performance-and-stamina-enhancers/2154870](https://www.sympla.com.br/evento/muscle-monster-jack-hammer---is-libido-booster-performance-and-stamina-enhancers/2154870)
[https://muscle-monster-jack-hammer-libido-booster.jimdosite.com/](https://muscle-monster-jack-hammer-libido-booster.jimdosite.com/)
[https://colab.research.google.com/drive/1Y6wOFGcw4\_HP768fZGyWNmr25b-3TfcT](https://colab.research.google.com/drive/1Y6wOFGcw4_HP768fZGyWNmr25b-3TfcT)
[https://colab.research.google.com/drive/1iZUrNYuDIHhDyOmd\_z4PqfBF\_gMyInQz](https://colab.research.google.com/drive/1iZUrNYuDIHhDyOmd_z4PqfBF_gMyInQz)
[https://colab.research.google.com/drive/1-9oyMeb8ZQHQPV7\_XVAn7lSH\_hj9qYNL](https://colab.research.google.com/drive/1-9oyMeb8ZQHQPV7_XVAn7lSH_hj9qYNL)
[https://colab.research.google.com/drive/1lUTgzKd90RoS3gZ2XPv3x9lVNM5ZnCNX](https://colab.research.google.com/drive/1lUTgzKd90RoS3gZ2XPv3x9lVNM5ZnCNX)
[https://colab.research.google.com/drive/1uBFdkYyNl9a-LZBfoSjkNc4tR6giacYN](https://colab.research.google.com/drive/1uBFdkYyNl9a-LZBfoSjkNc4tR6giacYN)
[https://form.jotform.com/Jack\_Hammer\_Libido/muscle-monster-jack-hammer](https://form.jotform.com/Jack_Hammer_Libido/muscle-monster-jack-hammer)
[https://events.humanitix.com/muscle-monster-jack-hammer](https://events.humanitix.com/muscle-monster-jack-hammer)
[https://devfolio.co/@mmjackhammer](https://devfolio.co/@mmjackhammer)
[https://devfolio.co/projects/muscle-monster-jack-hammer-8787](https://devfolio.co/projects/muscle-monster-jack-hammer-8787)
[https://evvnt.com/events/?\_evDiscoveryPath=%2Fevent%2F1946574-muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers&touch=1694428037](https://evvnt.com/events/?_evDiscoveryPath=%2Fevent%2F1946574-muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers&touch=1694428037)
[https://www.hitched.co.uk/web/jack-hammer-and-libido-booster](https://www.hitched.co.uk/web/jack-hammer-and-libido-booster)
[https://forums.hitched.co.uk/chat/forums/thread/muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers-1119023/](https://forums.hitched.co.uk/chat/forums/thread/muscle-monster-jack-hammer-is-libido-booster-performance-and-stamina-enhancers-1119023/)
[https://community.weddingwire.in/forum/muscle-monster-jack-hammer--t152004](https://community.weddingwire.in/forum/muscle-monster-jack-hammer--t152004)
[https://www.weddingwire.in/web/muscle-monster-and-jack-hammer](https://www.weddingwire.in/web/muscle-monster-and-jack-hammer)
[https://www.weddingwire.com/wedding-forums/muscle-monster-jack-hammer/0b150a852f266feb.html](https://www.weddingwire.com/wedding-forums/muscle-monster-jack-hammer/0b150a852f266feb.html)
[https://www.weddingwire.us/website/jack-hammer-and-libido-booster](https://www.weddingwire.us/website/jack-hammer-and-libido-booster)
[https://groups.google.com/g/muscle-monster-jack-hammer/c/7B08LawiEio](https://groups.google.com/g/muscle-monster-jack-hammer/c/7B08LawiEio)
[https://groups.google.com/g/get-jack-hammer-libido-booster/c/vGeipXxjbDk](https://groups.google.com/g/get-jack-hammer-libido-booster/c/vGeipXxjbDk)
[https://groups.google.com/g/get-jack-hammer-male-enhancement/c/xQnd8\_Q8F5U](https://groups.google.com/g/get-jack-hammer-male-enhancement/c/xQnd8_Q8F5U)
[https://groups.google.com/g/muscle-monster-jack-hammer-libido-booster/c/xSk0TlJquX0](https://groups.google.com/g/muscle-monster-jack-hammer-libido-booster/c/xSk0TlJquX0)
[https://groups.google.com/g/get-jack-hammer/c/9pg4sZPuxHw](https://groups.google.com/g/get-jack-hammer/c/9pg4sZPuxHw) |
hvai/loramore | 2023-09-11T12:42:36.000Z | [
"region:us"
] | hvai | null | null | null | 0 | 0 | Entry not found |
armpower/guanaco-llama2-1k | 2023-09-11T11:12:01.000Z | [
"region:us"
] | armpower | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
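As a minimal usage sketch (not part of the original card), the single `train` split declared in the `dataset_info` above can be loaded with the `datasets` library; the repository id `armpower/guanaco-llama2-1k` is assumed from this entry:

```python
from datasets import load_dataset

# Load the 1,000-example train split; each row exposes a single "text" field.
dataset = load_dataset("armpower/guanaco-llama2-1k", split="train")

print(dataset.num_rows)          # 1000, per the split metadata above
print(dataset[0]["text"][:200])  # peek at the start of the first example
```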
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dongyoung4091/shp_with_features_20k_flan_t5_large_sileod | 2023-09-11T11:16:15.000Z | [
"region:us"
] | dongyoung4091 | null | null | null | 0 | 0 | Entry not found |
saomars/sad | 2023-09-11T11:16:52.000Z | [
"license:c-uda",
"region:us"
] | saomars | null | null | null | 0 | 0 | ---
license: c-uda
---
|
vellorejana/cobol | 2023-09-11T11:20:49.000Z | [
"region:us"
] | vellorejana | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details__fsx_shared-falcon-180B_2100 | 2023-09-11T14:39:05.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of _fsx_shared-falcon-180B_2100
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [_fsx_shared-falcon-180B_2100](https://huggingface.co/_fsx_shared-falcon-180B_2100)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details__fsx_shared-falcon-180B_2100\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T14:38:41.751680](https://huggingface.co/datasets/open-llm-leaderboard/details__fsx_shared-falcon-180B_2100/blob/main/results_2023-09-11T14-38-41.751680.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7013964110418803,\n\
\ \"acc_stderr\": 0.030702382053756392,\n \"acc_norm\": 0.7050725401087311,\n\
\ \"acc_norm_stderr\": 0.030672281323978368,\n \"mc1\": 0.3157894736842105,\n\
\ \"mc1_stderr\": 0.016272287957916916,\n \"mc2\": 0.4692416686068408,\n\
\ \"mc2_stderr\": 0.014108890624515822\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6544368600682594,\n \"acc_stderr\": 0.013896938461145677,\n\
\ \"acc_norm\": 0.6860068259385665,\n \"acc_norm_stderr\": 0.013562691224726288\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7061342362079267,\n\
\ \"acc_stderr\": 0.004546002255456772,\n \"acc_norm\": 0.8914558852818164,\n\
\ \"acc_norm_stderr\": 0.003104306434972476\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.033911609343436046,\n\
\ \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.033911609343436046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7471698113207547,\n \"acc_stderr\": 0.026749899771241214,\n\
\ \"acc_norm\": 0.7471698113207547,\n \"acc_norm_stderr\": 0.026749899771241214\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948607,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948607\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7283236994219653,\n\
\ \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.7283236994219653,\n\
\ \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n\
\ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6595744680851063,\n \"acc_stderr\": 0.030976692998534446,\n\
\ \"acc_norm\": 0.6595744680851063,\n \"acc_norm_stderr\": 0.030976692998534446\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.04013124195424386,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.04013124195424386\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4947089947089947,\n \"acc_stderr\": 0.02574986828855657,\n \"\
acc_norm\": 0.4947089947089947,\n \"acc_norm_stderr\": 0.02574986828855657\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8451612903225807,\n\
\ \"acc_stderr\": 0.020579287326583227,\n \"acc_norm\": 0.8451612903225807,\n\
\ \"acc_norm_stderr\": 0.020579287326583227\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5615763546798029,\n \"acc_stderr\": 0.03491207857486519,\n\
\ \"acc_norm\": 0.5615763546798029,\n \"acc_norm_stderr\": 0.03491207857486519\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942084,\n \"\
acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942084\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607548,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607548\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.022421273612923714,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.022421273612923714\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083025,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083025\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7815126050420168,\n \"acc_stderr\": 0.026841514322958938,\n\
\ \"acc_norm\": 0.7815126050420168,\n \"acc_norm_stderr\": 0.026841514322958938\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4304635761589404,\n \"acc_stderr\": 0.04042809961395634,\n \"\
acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.04042809961395634\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8935779816513761,\n \"acc_stderr\": 0.013221554674594372,\n \"\
acc_norm\": 0.8935779816513761,\n \"acc_norm_stderr\": 0.013221554674594372\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813902,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813902\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065494,\n \
\ \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n\
\ \"acc_stderr\": 0.027584066602208274,\n \"acc_norm\": 0.7847533632286996,\n\
\ \"acc_norm_stderr\": 0.027584066602208274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.032785485373431386,\n\
\ \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.032785485373431386\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.03680918141673881,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.03680918141673881\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5982142857142857,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.5982142857142857,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.879948914431673,\n\
\ \"acc_stderr\": 0.011622736692041256,\n \"acc_norm\": 0.879948914431673,\n\
\ \"acc_norm_stderr\": 0.011622736692041256\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7890173410404624,\n \"acc_stderr\": 0.021966309947043114,\n\
\ \"acc_norm\": 0.7890173410404624,\n \"acc_norm_stderr\": 0.021966309947043114\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5072625698324023,\n\
\ \"acc_stderr\": 0.0167207374051795,\n \"acc_norm\": 0.5072625698324023,\n\
\ \"acc_norm_stderr\": 0.0167207374051795\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7810457516339869,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.7810457516339869,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.797427652733119,\n\
\ \"acc_stderr\": 0.022827317491059682,\n \"acc_norm\": 0.797427652733119,\n\
\ \"acc_norm_stderr\": 0.022827317491059682\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8179012345679012,\n \"acc_stderr\": 0.021473491834808355,\n\
\ \"acc_norm\": 0.8179012345679012,\n \"acc_norm_stderr\": 0.021473491834808355\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5501955671447197,\n\
\ \"acc_stderr\": 0.01270572149856497,\n \"acc_norm\": 0.5501955671447197,\n\
\ \"acc_norm_stderr\": 0.01270572149856497\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.026431329870789534,\n\
\ \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.026431329870789534\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7532679738562091,\n \"acc_stderr\": 0.0174408203674025,\n \
\ \"acc_norm\": 0.7532679738562091,\n \"acc_norm_stderr\": 0.0174408203674025\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7636363636363637,\n\
\ \"acc_stderr\": 0.040693063197213775,\n \"acc_norm\": 0.7636363636363637,\n\
\ \"acc_norm_stderr\": 0.040693063197213775\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.027049257915896182,\n\
\ \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.027049257915896182\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\
\ \"acc_stderr\": 0.021166216304659386,\n \"acc_norm\": 0.900497512437811,\n\
\ \"acc_norm_stderr\": 0.021166216304659386\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070813,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070813\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3157894736842105,\n\
\ \"mc1_stderr\": 0.016272287957916916,\n \"mc2\": 0.4692416686068408,\n\
\ \"mc2_stderr\": 0.014108890624515822\n }\n}\n```"
repo_url: https://huggingface.co/_fsx_shared-falcon-180B_2100
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|arc:challenge|25_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hellaswag|10_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T14-38-41.751680.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T14-38-41.751680.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T11_31_21.074717
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T11-31-21.074717.parquet'
- split: 2023_09_11T14_38_41.751680
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T14-38-41.751680.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T14-38-41.751680.parquet'
- config_name: results
data_files:
- split: 2023_09_11T11_31_21.074717
path:
- results_2023-09-11T11-31-21.074717.parquet
- split: 2023_09_11T14_38_41.751680
path:
- results_2023-09-11T14-38-41.751680.parquet
- split: latest
path:
- results_2023-09-11T14-38-41.751680.parquet
---
# Dataset Card for Evaluation run of _fsx_shared-falcon-180B_2100
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/_fsx_shared-falcon-180B_2100
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [_fsx_shared-falcon-180B_2100](https://huggingface.co/_fsx_shared-falcon-180B_2100) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details__fsx_shared-falcon-180B_2100",
"harness_truthfulqa_mc_0",
split="train")
```
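The same pattern works for the other configurations declared in this card's metadata. As a small sketch (the `results` configuration, the `latest` split, and the timestamped split name below are all taken from the metadata above; nothing else is assumed):
```python
from datasets import load_dataset

# Aggregated scores of the most recent run: the "latest" split of the "results" config.
results = load_dataset("open-llm-leaderboard/details__fsx_shared-falcon-180B_2100",
	"results",
	split="latest")

# Details of one specific run, addressed by its timestamped split name.
run = load_dataset("open-llm-leaderboard/details__fsx_shared-falcon-180B_2100",
	"harness_truthfulqa_mc_0",
	split="2023_09_11T14_38_41.751680")
```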
## Latest results
These are the [latest results from run 2023-09-11T14:38:41.751680](https://huggingface.co/datasets/open-llm-leaderboard/details__fsx_shared-falcon-180B_2100/blob/main/results_2023-09-11T14-38-41.751680.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7013964110418803,
"acc_stderr": 0.030702382053756392,
"acc_norm": 0.7050725401087311,
"acc_norm_stderr": 0.030672281323978368,
"mc1": 0.3157894736842105,
"mc1_stderr": 0.016272287957916916,
"mc2": 0.4692416686068408,
"mc2_stderr": 0.014108890624515822
},
"harness|arc:challenge|25": {
"acc": 0.6544368600682594,
"acc_stderr": 0.013896938461145677,
"acc_norm": 0.6860068259385665,
"acc_norm_stderr": 0.013562691224726288
},
"harness|hellaswag|10": {
"acc": 0.7061342362079267,
"acc_stderr": 0.004546002255456772,
"acc_norm": 0.8914558852818164,
"acc_norm_stderr": 0.003104306434972476
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.033911609343436046,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.033911609343436046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7471698113207547,
"acc_stderr": 0.026749899771241214,
"acc_norm": 0.7471698113207547,
"acc_norm_stderr": 0.026749899771241214
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948607,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948607
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6595744680851063,
"acc_stderr": 0.030976692998534446,
"acc_norm": 0.6595744680851063,
"acc_norm_stderr": 0.030976692998534446
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.04013124195424386,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.04013124195424386
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4947089947089947,
"acc_stderr": 0.02574986828855657,
"acc_norm": 0.4947089947089947,
"acc_norm_stderr": 0.02574986828855657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8451612903225807,
"acc_stderr": 0.020579287326583227,
"acc_norm": 0.8451612903225807,
"acc_norm_stderr": 0.020579287326583227
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5615763546798029,
"acc_stderr": 0.03491207857486519,
"acc_norm": 0.5615763546798029,
"acc_norm_stderr": 0.03491207857486519
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.023253157951942084,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.023253157951942084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607548,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607548
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.022421273612923714,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.022421273612923714
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083025,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083025
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7815126050420168,
"acc_stderr": 0.026841514322958938,
"acc_norm": 0.7815126050420168,
"acc_norm_stderr": 0.026841514322958938
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8935779816513761,
"acc_stderr": 0.013221554674594372,
"acc_norm": 0.8935779816513761,
"acc_norm_stderr": 0.013221554674594372
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813902,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813902
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065494,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7847533632286996,
"acc_stderr": 0.027584066602208274,
"acc_norm": 0.7847533632286996,
"acc_norm_stderr": 0.027584066602208274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.032785485373431386,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.032785485373431386
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.03680918141673881,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.03680918141673881
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5982142857142857,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.5982142857142857,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.879948914431673,
"acc_stderr": 0.011622736692041256,
"acc_norm": 0.879948914431673,
"acc_norm_stderr": 0.011622736692041256
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7890173410404624,
"acc_stderr": 0.021966309947043114,
"acc_norm": 0.7890173410404624,
"acc_norm_stderr": 0.021966309947043114
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5072625698324023,
"acc_stderr": 0.0167207374051795,
"acc_norm": 0.5072625698324023,
"acc_norm_stderr": 0.0167207374051795
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7810457516339869,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.7810457516339869,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.797427652733119,
"acc_stderr": 0.022827317491059682,
"acc_norm": 0.797427652733119,
"acc_norm_stderr": 0.022827317491059682
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8179012345679012,
"acc_stderr": 0.021473491834808355,
"acc_norm": 0.8179012345679012,
"acc_norm_stderr": 0.021473491834808355
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5501955671447197,
"acc_stderr": 0.01270572149856497,
"acc_norm": 0.5501955671447197,
"acc_norm_stderr": 0.01270572149856497
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7463235294117647,
"acc_stderr": 0.026431329870789534,
"acc_norm": 0.7463235294117647,
"acc_norm_stderr": 0.026431329870789534
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7532679738562091,
"acc_stderr": 0.0174408203674025,
"acc_norm": 0.7532679738562091,
"acc_norm_stderr": 0.0174408203674025
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.040693063197213775,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.040693063197213775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7673469387755102,
"acc_stderr": 0.027049257915896182,
"acc_norm": 0.7673469387755102,
"acc_norm_stderr": 0.027049257915896182
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659386,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659386
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070813,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070813
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3157894736842105,
"mc1_stderr": 0.016272287957916916,
"mc2": 0.4692416686068408,
"mc2_stderr": 0.014108890624515822
}
}
```
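The per-task dictionary above can also be post-processed directly. A minimal sketch, assuming the downloaded JSON matches the dictionary printed here (in some dumps the scores are nested under a top-level `"results"` key, which the snippet unwraps), that averages 5-shot accuracy over the 57 `hendrycksTest` (MMLU) subtasks:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file linked above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details__fsx_shared-falcon-180B_2100",
    filename="results_2023-09-11T14-38-41.751680.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)
scores = data.get("results", data)  # unwrap if nested under a "results" key

# Mean accuracy across all MMLU subtasks (keys prefixed "harness|hendrycksTest-").
mmlu = [v["acc"] for task, v in scores.items()
        if task.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```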
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Dolyfin/RVC2Models | 2023-09-11T11:34:05.000Z | [
"license:openrail",
"region:us"
] | Dolyfin | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-13b | 2023-09-11T11:47:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of uukuguy/speechless-codellama-dolphin-orca-platypus-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-codellama-dolphin-orca-platypus-13b](https://huggingface.co/uukuguy/speechless-codellama-dolphin-orca-platypus-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T11:46:04.714895](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-13b/blob/main/results_2023-09-11T11-46-04.714895.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4409281791882639,\n\
\ \"acc_stderr\": 0.035246385009241446,\n \"acc_norm\": 0.44455750995634685,\n\
\ \"acc_norm_stderr\": 0.03524275721050058,\n \"mc1\": 0.2962056303549572,\n\
\ \"mc1_stderr\": 0.015983595101811392,\n \"mc2\": 0.4627912956571528,\n\
\ \"mc2_stderr\": 0.01466090570906347\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4129692832764505,\n \"acc_stderr\": 0.014388344935398324,\n\
\ \"acc_norm\": 0.44795221843003413,\n \"acc_norm_stderr\": 0.014532011498211669\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5068711412069309,\n\
\ \"acc_stderr\": 0.0049893102282761136,\n \"acc_norm\": 0.686018721370245,\n\
\ \"acc_norm_stderr\": 0.00463160353975195\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.362962962962963,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.362962962962963,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3886792452830189,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.3886792452830189,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4513888888888889,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.4513888888888889,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n\
\ \"acc_stderr\": 0.03692820767264867,\n \"acc_norm\": 0.37572254335260113,\n\
\ \"acc_norm_stderr\": 0.03692820767264867\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.031907012423268113,\n\
\ \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.031907012423268113\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3931034482758621,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.3931034482758621,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.02300008685906864,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.02300008685906864\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4290322580645161,\n\
\ \"acc_stderr\": 0.028156036538233217,\n \"acc_norm\": 0.4290322580645161,\n\
\ \"acc_norm_stderr\": 0.028156036538233217\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.031089826002937523,\n\
\ \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.031089826002937523\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806587,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806587\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5353535353535354,\n \"acc_stderr\": 0.03553436368828063,\n \"\
acc_norm\": 0.5353535353535354,\n \"acc_norm_stderr\": 0.03553436368828063\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5544041450777202,\n \"acc_stderr\": 0.03587014986075659,\n\
\ \"acc_norm\": 0.5544041450777202,\n \"acc_norm_stderr\": 0.03587014986075659\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3230769230769231,\n \"acc_stderr\": 0.023710888501970555,\n\
\ \"acc_norm\": 0.3230769230769231,\n \"acc_norm_stderr\": 0.023710888501970555\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844086,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844086\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42016806722689076,\n \"acc_stderr\": 0.032061837832361516,\n\
\ \"acc_norm\": 0.42016806722689076,\n \"acc_norm_stderr\": 0.032061837832361516\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119994,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119994\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5339449541284403,\n \"acc_stderr\": 0.021387863350353982,\n \"\
acc_norm\": 0.5339449541284403,\n \"acc_norm_stderr\": 0.021387863350353982\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.03114144782353602,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03114144782353602\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6323529411764706,\n \"acc_stderr\": 0.03384132045674118,\n \"\
acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.03384132045674118\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6708860759493671,\n \"acc_stderr\": 0.03058732629470236,\n \
\ \"acc_norm\": 0.6708860759493671,\n \"acc_norm_stderr\": 0.03058732629470236\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5022421524663677,\n\
\ \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.5022421524663677,\n\
\ \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4351145038167939,\n \"acc_stderr\": 0.043482080516448585,\n\
\ \"acc_norm\": 0.4351145038167939,\n \"acc_norm_stderr\": 0.043482080516448585\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.044492703500683836,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.044492703500683836\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04833682445228318,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04833682445228318\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49693251533742333,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.49693251533742333,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.047504583990416946,\n\
\ \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.047504583990416946\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6752136752136753,\n\
\ \"acc_stderr\": 0.03067902276549883,\n \"acc_norm\": 0.6752136752136753,\n\
\ \"acc_norm_stderr\": 0.03067902276549883\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5402298850574713,\n\
\ \"acc_stderr\": 0.01782199409693354,\n \"acc_norm\": 0.5402298850574713,\n\
\ \"acc_norm_stderr\": 0.01782199409693354\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.430635838150289,\n \"acc_stderr\": 0.02665880027367238,\n\
\ \"acc_norm\": 0.430635838150289,\n \"acc_norm_stderr\": 0.02665880027367238\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3474860335195531,\n\
\ \"acc_stderr\": 0.01592556406020815,\n \"acc_norm\": 0.3474860335195531,\n\
\ \"acc_norm_stderr\": 0.01592556406020815\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.028180596328259283,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.028180596328259283\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.48231511254019294,\n\
\ \"acc_stderr\": 0.02838032284907713,\n \"acc_norm\": 0.48231511254019294,\n\
\ \"acc_norm_stderr\": 0.02838032284907713\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4783950617283951,\n \"acc_stderr\": 0.02779476010500874,\n\
\ \"acc_norm\": 0.4783950617283951,\n \"acc_norm_stderr\": 0.02779476010500874\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3546099290780142,\n \"acc_stderr\": 0.02853865002887864,\n \
\ \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.02853865002887864\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35528031290743156,\n\
\ \"acc_stderr\": 0.012223623364044041,\n \"acc_norm\": 0.35528031290743156,\n\
\ \"acc_norm_stderr\": 0.012223623364044041\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4019607843137255,\n \"acc_stderr\": 0.019835176484375373,\n \
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.019835176484375373\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907297,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907297\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.031680911612338825,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.031680911612338825\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5522388059701493,\n\
\ \"acc_stderr\": 0.03516184772952167,\n \"acc_norm\": 0.5522388059701493,\n\
\ \"acc_norm_stderr\": 0.03516184772952167\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5029239766081871,\n \"acc_stderr\": 0.03834759370936839,\n\
\ \"acc_norm\": 0.5029239766081871,\n \"acc_norm_stderr\": 0.03834759370936839\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2962056303549572,\n\
\ \"mc1_stderr\": 0.015983595101811392,\n \"mc2\": 0.4627912956571528,\n\
\ \"mc2_stderr\": 0.01466090570906347\n }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-codellama-dolphin-orca-platypus-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|arc:challenge|25_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hellaswag|10_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T11-46-04.714895.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T11-46-04.714895.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T11-46-04.714895.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T11-46-04.714895.parquet'
- config_name: results
data_files:
- split: 2023_09_11T11_46_04.714895
path:
- results_2023-09-11T11-46-04.714895.parquet
- split: latest
path:
- results_2023-09-11T11-46-04.714895.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-codellama-dolphin-orca-platypus-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-codellama-dolphin-orca-platypus-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-codellama-dolphin-orca-platypus-13b](https://huggingface.co/uukuguy/speechless-codellama-dolphin-orca-platypus-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-13b",
"harness_truthfulqa_mc_0",
split="train")
```
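The aggregated metrics live in the separate "results" configuration listed above; here is a minimal sketch of loading them, using the "latest" split that every configuration in this repository defines:
```python
from datasets import load_dataset

# "latest" always points to the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-13b",
    "results",
    split="latest",
)
```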
## Latest results
These are the [latest results from run 2023-09-11T11:46:04.714895](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-13b/blob/main/results_2023-09-11T11-46-04.714895.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4409281791882639,
"acc_stderr": 0.035246385009241446,
"acc_norm": 0.44455750995634685,
"acc_norm_stderr": 0.03524275721050058,
"mc1": 0.2962056303549572,
"mc1_stderr": 0.015983595101811392,
"mc2": 0.4627912956571528,
"mc2_stderr": 0.01466090570906347
},
"harness|arc:challenge|25": {
"acc": 0.4129692832764505,
"acc_stderr": 0.014388344935398324,
"acc_norm": 0.44795221843003413,
"acc_norm_stderr": 0.014532011498211669
},
"harness|hellaswag|10": {
"acc": 0.5068711412069309,
"acc_stderr": 0.0049893102282761136,
"acc_norm": 0.686018721370245,
"acc_norm_stderr": 0.00463160353975195
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3886792452830189,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.3886792452830189,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4513888888888889,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.4513888888888889,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.03692820767264867,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.03692820767264867
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39148936170212767,
"acc_stderr": 0.031907012423268113,
"acc_norm": 0.39148936170212767,
"acc_norm_stderr": 0.031907012423268113
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3931034482758621,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.3931034482758621,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.02300008685906864,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.02300008685906864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4290322580645161,
"acc_stderr": 0.028156036538233217,
"acc_norm": 0.4290322580645161,
"acc_norm_stderr": 0.028156036538233217
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.031089826002937523,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.031089826002937523
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806587,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806587
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5353535353535354,
"acc_stderr": 0.03553436368828063,
"acc_norm": 0.5353535353535354,
"acc_norm_stderr": 0.03553436368828063
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5544041450777202,
"acc_stderr": 0.03587014986075659,
"acc_norm": 0.5544041450777202,
"acc_norm_stderr": 0.03587014986075659
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3230769230769231,
"acc_stderr": 0.023710888501970555,
"acc_norm": 0.3230769230769231,
"acc_norm_stderr": 0.023710888501970555
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844086,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844086
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42016806722689076,
"acc_stderr": 0.032061837832361516,
"acc_norm": 0.42016806722689076,
"acc_norm_stderr": 0.032061837832361516
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119994,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119994
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5339449541284403,
"acc_stderr": 0.021387863350353982,
"acc_norm": 0.5339449541284403,
"acc_norm_stderr": 0.021387863350353982
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03114144782353602,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03114144782353602
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.03384132045674118,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.03384132045674118
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6708860759493671,
"acc_stderr": 0.03058732629470236,
"acc_norm": 0.6708860759493671,
"acc_norm_stderr": 0.03058732629470236
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5022421524663677,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.5022421524663677,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4351145038167939,
"acc_stderr": 0.043482080516448585,
"acc_norm": 0.4351145038167939,
"acc_norm_stderr": 0.043482080516448585
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.044492703500683836,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.044492703500683836
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5,
"acc_stderr": 0.04833682445228318,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04833682445228318
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49693251533742333,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.49693251533742333,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.047504583990416946,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.047504583990416946
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6752136752136753,
"acc_stderr": 0.03067902276549883,
"acc_norm": 0.6752136752136753,
"acc_norm_stderr": 0.03067902276549883
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5402298850574713,
"acc_stderr": 0.01782199409693354,
"acc_norm": 0.5402298850574713,
"acc_norm_stderr": 0.01782199409693354
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.430635838150289,
"acc_stderr": 0.02665880027367238,
"acc_norm": 0.430635838150289,
"acc_norm_stderr": 0.02665880027367238
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3474860335195531,
"acc_stderr": 0.01592556406020815,
"acc_norm": 0.3474860335195531,
"acc_norm_stderr": 0.01592556406020815
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.028180596328259283,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.028180596328259283
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.48231511254019294,
"acc_stderr": 0.02838032284907713,
"acc_norm": 0.48231511254019294,
"acc_norm_stderr": 0.02838032284907713
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4783950617283951,
"acc_stderr": 0.02779476010500874,
"acc_norm": 0.4783950617283951,
"acc_norm_stderr": 0.02779476010500874
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.02853865002887864,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.02853865002887864
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35528031290743156,
"acc_stderr": 0.012223623364044041,
"acc_norm": 0.35528031290743156,
"acc_norm_stderr": 0.012223623364044041
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3014705882352941,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.3014705882352941,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.019835176484375373,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.019835176484375373
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907297,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907297
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5522388059701493,
"acc_stderr": 0.03516184772952167,
"acc_norm": 0.5522388059701493,
"acc_norm_stderr": 0.03516184772952167
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5029239766081871,
"acc_stderr": 0.03834759370936839,
"acc_norm": 0.5029239766081871,
"acc_norm_stderr": 0.03834759370936839
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2962056303549572,
"mc1_stderr": 0.015983595101811392,
"mc2": 0.4627912956571528,
"mc2_stderr": 0.01466090570906347
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
devxpy/spotify-genres | 2023-09-11T11:54:43.000Z | [
"region:us"
] | devxpy | null | null | null | 0 | 0 | Spotify genres scraped from https://everynoise.com/everynoise1d.cgi?scope=all
---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: genre_name
dtype: string
- name: genre_slug
dtype: string
- name: playlist_url
dtype: string
- name: description
dtype: string
splits:
- name: train
num_bytes: 1047789
num_examples: 6276
download_size: 577290
dataset_size: 1047789
---
# Dataset Card for "spotify-genres"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nadinegp/Pharoh | 2023-09-11T12:00:43.000Z | [
"license:openrail",
"region:us"
] | Nadinegp | null | null | null | 0 | 0 | ---
license: openrail
---
|
bellagio-ai/dreambooth-duc-ba-cathedral | 2023-09-11T12:02:18.000Z | [
"region:us"
] | bellagio-ai | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 2382062.0
num_examples: 5
download_size: 2383242
dataset_size: 2382062.0
---
# Dataset Card for "dreambooth-duc-ba-cathedral"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Meg98/insuranceqa | 2023-09-11T12:15:56.000Z | [
"region:us"
] | Meg98 | null | null | null | 0 | 0 | |
Divyaraj1/Srk | 2023-09-11T12:39:26.000Z | [
"license:openrail",
"region:us"
] | Divyaraj1 | null | null | null | 0 | 0 | ---
license: openrail
---
|
mrbrain404/venv | 2023-09-11T13:57:30.000Z | [
"license:other",
"region:us"
] | mrbrain404 | null | null | null | 0 | 0 | ---
license: other
---
|
jsrdhher/imatest02 | 2023-10-09T23:46:50.000Z | [
"region:us"
] | jsrdhher | null | null | null | 0 | 0 | Entry not found |
PVIT/pvit_data_stage1 | 2023-09-19T03:54:56.000Z | [
"license:cc-by-nc-4.0",
"arxiv:2308.13437",
"region:us"
] | PVIT | null | null | null | 2 | 0 | ---
license: cc-by-nc-4.0
---
# PVIT dataset
This is the stage-1 pretraining dataset of the paper [Position-Enhanced Visual Instruction Tuning for Multimodal Large Language Models](https://arxiv.org/abs/2308.13437).
## Model description
Position-enhanced Visual Instruction Tuning (PVIT) extends the MLLM by incorporating an additional region-level vision encoder to support region-based inputs. Specifically, we adopt the vision encoder from RegionCLIP and use it to extract region-level features, taking images and regions as inputs. Because the region-level features are incorporated as an additional source of information, they have a minimal impact on the original MLLM. Furthermore, since the features provided by RegionCLIP are themselves already aligned with language at a fine-grained level, the overhead of aligning them to the MLLM is relatively small. Following [LLaVA](https://github.com/haotian-liu/LLaVA), we design a two-stage training strategy for PVIT that first pre-trains a linear projection to align the region features with the LLM word embeddings, followed by end-to-end fine-tuning to follow complex fine-grained instructions.
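As a rough illustration of the stage-1 idea, here is a minimal, hypothetical sketch of projecting frozen region-level features into the LLM word-embedding space (the module name, dimensions, and shapes below are illustrative assumptions, not the actual PVIT code):
```python
import torch
import torch.nn as nn

class RegionProjector(nn.Module):
    """Hypothetical stage-1 module: a linear layer mapping frozen
    region-level features into the LLM word-embedding space."""

    def __init__(self, region_dim: int = 1024, llm_embed_dim: int = 4096):
        super().__init__()
        # The only trainable component in stage 1
        self.proj = nn.Linear(region_dim, llm_embed_dim)

    def forward(self, region_feats: torch.Tensor) -> torch.Tensor:
        # region_feats: (batch, num_regions, region_dim), e.g. RegionCLIP outputs
        return self.proj(region_feats)  # (batch, num_regions, llm_embed_dim)

projector = RegionProjector()
dummy_feats = torch.randn(2, 4, 1024)  # placeholder for extracted region features
print(projector(dummy_feats).shape)    # torch.Size([2, 4, 4096])
```
The projected region tokens can then be interleaved with the usual text and image tokens fed to the language model.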
For more details, please refer to our [paper](https://arxiv.org/abs/2308.13437) and [github repo](https://github.com/THUNLP-MT/PVIT).
## How to use
See [here](https://github.com/THUNLP-MT/PVIT#Train) for instructions of pretraining.
## Intended use
Primary intended uses: The primary use of PVIT is research on large multimodal models and chatbots.
Primary intended users: The primary intended users of the model are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence.
## BibTeX entry and citation info
```bibtex
@misc{chen2023positionenhanced,
title={Position-Enhanced Visual Instruction Tuning for Multimodal Large Language Models},
author={Chi Chen and Ruoyu Qin and Fuwen Luo and Xiaoyue Mi and Peng Li and Maosong Sun and Yang Liu},
year={2023},
eprint={2308.13437},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
|
AnimaleMejoraMasculinaVenezuela/AnimaleMaleEnhancementVenezuela | 2023-09-11T13:08:16.000Z | [
"region:us"
] | AnimaleMejoraMasculinaVenezuela | null | null | null | 0 | 0 | ## Official Facebook pages:

- https://www.facebook.com/AnimaleMaleEnhancementVe/
- https://www.facebook.com/AnimaleMaleEnhancementInVE/
- https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/
- https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/
- https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/
- https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/

### ✅ Product name: [Animale Male Enhancement Venezuela](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
### ✅ Benefits: [Stronger sex drive](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
### ✅ Quantity: [30 pills](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)
### ✅ Rating: ★★★★☆ (4.5/5.0)
### ✅ Offer: [Unconditional 90-day guarantee](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)

**[Animale Male Enhancement Venezuela](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt), the most potent male enhancement pills on the market right now.**

As a way to boost their confidence and have more satisfying sexual encounters, men give priority to improving their sex lives. However, out of a lack of knowledge and experience, many men turn to strategies that are ineffective and may even harm their long-term health. That being the case, we would like to tell you about Animale Male Enhancement Venezuela, the most potent male enhancement pills on the market right now. These gummies are made from all-natural ingredients and are safe for anyone interested in improving their sexual performance.

[![Animale Male Enhancement Venezuela capsules](https://i.ibb.co/SsB5pxY/Animale-Male-Enhancement-Venezuela-Capsules.png)](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)

**[MUST SEE: CLICK HERE TO BUY AND GET 70% OFF IN VENEZUELA](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)**

In addition, they come with a selection of bonuses that, if used, can improve your masculinity in several ways. So if you have been looking for a natural method to boost your libido and improve your sexual performance, you need look no further.

**[CHECK THE DISCOUNTED PRICE AVAILABLE BY TAPPING HERE: OFFICIAL WEBSITE](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)**

**Product description**

Animale Male Enhancement Venezuela pills are a popular choice among those who want to boost their sexual performance. The infused gummy candies are a natural way to improve sexual performance and stamina. They are convenient, since you can take them whenever you like without fear of negative side effects. The best part is that you can start to feel the results within minutes of taking them.

**The benefits of using Animale Male Enhancement Venezuela to improve sexual performance**

The life-saving potential of male enhancement tablets is certainly plausible. What could be more useful than gummies for boosting libido? These pills are effective and safe to use, since they include Male Enhancement Venezuela, a natural aphrodisiac.

In addition, they are vegan and gluten-free, which makes them an excellent option for anyone who has to follow a special diet. The capsules are easy to use on the go, since they dissolve quickly in the mouth. Every man who wants to reach his full sexual potential should include these gummies to stimulate his erogenous zones.

**Stronger erections**

Having sex is pleasurable, there is no doubt about that, and you can have a lot of fun if things go well. On the other hand, if you are having trouble in the bedroom, it could be a sign that you are struggling with something more serious, such as anxiety or depression. If you are looking for a natural way to improve your sexual performance, [Male Enhancement Venezuela](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt) may be the answer you were looking for. The evidence suggests that it increases blood flow to the penis, which means more blood is pumped into the penis during sex. It is also useful for treating mental health problems such as anxiety and depression, which can have a negative impact on sexual performance. Finally, the gummies can be consumed simply by chewing and swallowing them like candy.

[![Animale Male Enhancement Venezuela](https://i.ibb.co/C6fPph6/Animale-Male-Enhancement-Venezuela.png)](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)

**[VISIT THE OFFICIAL WEBSITE TO BUY YOUR BOTTLE: GET IT NOW](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)**

**A greater desire for sexual contact**

The urge to engage in sexual activity tends to increase in times of intense stress or worry. Male Enhancement Venezuela gummies, for their part, are meant to relieve the distress of sexual problems by increasing libido and performance in the bedroom. They do this by lowering stress and anxiety levels and increasing blood flow, which results in more powerful erections. The gummies are perfect for relieving stress after a long day at work, since they not only boost libido but also help you relax. The best results can be achieved simply by placing one in your mouth before bed. You should look forward to it.

**[➢ VISIT THE OFFICIAL WEBSITE TO BUY: SPECIAL OFFER TODAY!!](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)**

**An improved sense of confidence**

Male Enhancement Venezuela in its purest, most natural form is used to make these bite-sized treats. They are useful for boosting confidence and sexual efficiency. All it takes to get a break from the stress and worries it can generate is to pop one in your mouth and let the magic happen. Whether you are going through a difficult time or simply want a break from your usual worries, you will feel better after a few minutes of this pastime.

Male Enhancement Venezuela gummies are highly recommended in such a situation, since they include natural ingredients that stimulate circulation and stamina. In addition, there is no danger in using them and there are no unwanted side effects. Two or three gummies taken shortly before bed can increase your libido and keep you strong all night.

**Male Enhancement Venezuela oil**

If you are looking for a natural and effective way to boost your sexual performance, these gummies are an excellent option to explore. The candies are trustworthy, since they use only the finest ingredients (oil and natural flavors). Besides, all you have to do to take one is put it in your mouth.

**L-arginine**

Gummies with L-arginine are an essential item in any male enhancement regimen. It is a nitrogenous substance that has been shown to increase blood flow and improve libido. Nitric oxide, produced when L-arginine is broken down, dilates the blood vessels and increases a person's sexual desire. The additional ingredients in these gummy bears are expected to improve your sexual performance.

**Saw palmetto fruit**

The berry of the saw palmetto tree has a long history of use for boosting natural testosterone production. It is also a key ingredient in the popular male enhancement gummies, which have been shown to increase libido and male performance in bed. Male Enhancement Venezuela is a compound found in cannabis. It will raise your testosterone levels, and saw palmetto berries also contain a variety of other key nutrients, such as vitamins B6 and B12, which help maintain and improve sexual health and performance. These candies also include horny goat weed, a plant that has been shown to increase a man's libido and effectiveness in bed.

[![Animale Male Enhancement Venezuela pills](https://i.ibb.co/VvNv7h0/Animale-Male-Enhancement-Venezuela-PIlls.png)](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)

**[CHECK THE DISCOUNTED PRICE AVAILABLE BY TAPPING HERE: OFFICIAL WEBSITE](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)**

**Tribulus terrestris, Animale male enhancement gummies**

It has been known for centuries that the Tribulus terrestris plant can improve both sexual health and performance, which is why it has been used in traditional Chinese medicine. It has also been shown to increase testosterone levels, which may explain why some men report increased libido and stronger erections after using it. Citations are lacking for this section. The active components of Tribulus terrestris gummies are most effective when consumed just before bedtime, so that is how you can get the most out of them. As an added benefit, they can also serve as a late-night snack; simply enjoy them without fear of negative consequences.

**Eurycoma longifolia (Long Jack)**

Looking for a more organic approach to improving your sexual performance? Consider the Eurycoma longifolia that forms the basis of these candies. These remedies help increase blood flow to the penis, which in turn improves sexual performance and overall sexual enjoyment. Because they are so easy to consume, you can have one whenever you choose without feeling excessive guilt.

**With the help of Turbo [Animale Male Enhancement Venezuela](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt), how can you improve your sexual performance?**

One of the potential advantages of male enhancement tablets is improved sexual performance. The gummies are popular because they are convenient to use and have few, if any, negative side effects. As soon as you include them in your regular routine, you will begin to reap the benefits they provide. When used together with other male enhancement treatments, such as testosterone boosters or erectile dysfunction medication, they deliver even better results. So tell me: what are you waiting for? Get some gummies now and see if they work for you.

**About Animale male enhancement gummy bears**

Taking male enhancement gummies is a convenient and easy way to potentially boost sexual performance. They are a great option for those who do not want their medicine to taste unpleasant, since they have no smell or flavor. You can rest assured that the gummies are both helpful and safe to take, since the compound is a natural chemical found in a variety of cannabis strains. The evidence suggests it may help people with sexual health problems, including reduced libido or erectile dysfunction. So the gummies are the best option if you are looking for a way to improve your sexual performance.

**How exactly does Animale male enhancement work?**

There is a growing trend of using gummies for male enhancement. The compound found in these chewy candies has been discovered in cannabis and has been linked to positive therapeutic effects in humans. It is meant to help men overcome problems such as erectile dysfunction (ED) and other issues unique to them. Easy to take, the gummies help men with a variety of male-specific problems. You can enjoy them without worrying about negative side effects or interactions with other treatments you may be receiving, since they are harmless and discreet.

**Conclusion**

Are you looking for a method to improve your sexual performance that is both natural and highly effective? If so, you are in luck! Our expert staff has carried out considerable research and analysis and concluded that Animale Male Enhancement Venezuela are the best gummies on the market right now. These gummy candies are made to help men of any age improve their sexual performance, and you will be glad to know they have many more benefits as well. In addition, our simple instructions will show you how to use Animale Male Enhancement Venezuela to its full potential.

[![Animale Male Venezuela](https://i.ibb.co/d6fy5cc/Animale-Male-Venezuela.png)](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)

**[MUST SEE: CLICK HERE TO BUY AND GET 70% OFF IN VENEZUELA](https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt)**

### Our official blogs ⇒

- https://animale-male-enhancement-venez-08a14d.webflow.io/
- https://animale-male-enhancement-ve-92964b.webflow.io/
- https://animalemaleenhancementvenezuela.webflow.io/
- https://animalemaleenhancementve.webflow.io/
- https://animale-male-enhancement-venezuela.mystrikingly.com/
- https://animale-male-enhancement-ve.mystrikingly.com/
- https://animalemaleenhancementvenezuela.mystrikingly.com/
- http://animalemaleenhancementve.mystrikingly.com/
- https://animale-male-enhancement-venezuela-1.jimdosite.com/
- https://animale-male-enhancement-ve-1.jimdosite.com/
- https://animalemaleenhancementvenezuela.jimdosite.com/
- https://animalemaleenhancementve.jimdosite.com/
- https://animale-male-enhancement-venez.godaddysites.com/
- https://animale-male-enhancement-ve.godaddysites.com/
- https://animalemaleenhancementvenezue1.godaddysites.com/
- https://animalemaleenhancementve.godaddysites.com/
- https://animalemaleenhancement-venezuela.jigsy.com/
- https://animale-male-enhancement-ve.jigsy.com/
- https://animalemaleenhancementvenezuela.jigsy.com/
- https://animalemaleenhancementve.jigsy.com/
- https://animale-male-enhancement-venezuela.company.site/
- https://animale-male-enhancement-ve.company.site/
- https://animalemaleenhancementinvenezuela.company.site/
- https://animalemaleenhancementve.company.site/
- https://sites.google.com/view/animale-male-enhancement-ve/
- https://sites.google.com/view/animale-male-enhancement-in-ve/
- https://groups.google.com/u/1/g/animale-male-enhancement-venezuela-buy/c/vojIXk_VbA4
<p><a href="https://groups.google.com/u/1/g/animale-male-enhancement-venezuela-buy/c/lX6zqO5b_jM"><strong>https://groups.google.com/u/1/g/animale-male-enhancement-venezuela-buy/c/lX6zqO5b_jM</strong></a></p>
<p><a href="https://groups.google.com/u/1/g/animale-male-enhancement-venezuela-buy/c/6KTlPYQ-pdI"><strong>https://groups.google.com/u/1/g/animale-male-enhancement-venezuela-buy/c/6KTlPYQ-pdI</strong></a></p>
<p><a href="https://groups.google.com/u/1/g/animale-male-enhancement-venezuela-buy/c/l-N-_2YoGVw"><strong>https://groups.google.com/u/1/g/animale-male-enhancement-venezuela-buy/c/l-N-_2YoGVw</strong></a></p>
<p><a href="https://lookerstudio.google.com/u/0/reporting/399a76e1-6ad8-466e-82f6-d741df344a3c/page/fqHFD"><strong>https://lookerstudio.google.com/u/0/reporting/399a76e1-6ad8-466e-82f6-d741df344a3c/page/fqHFD</strong></a></p>
<p><a href="https://lookerstudio.google.com/u/0/reporting/c0ee559d-067d-40b4-b3cc-e274756b2125/page/ngDFD"><strong>https://lookerstudio.google.com/u/0/reporting/c0ee559d-067d-40b4-b3cc-e274756b2125/page/ngDFD</strong></a></p>
<p> </p>
<h3><span style="background-color: #ffff00;"><strong>Animale Male Enhancement Australia Official Links ⇒</strong></span></h3>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInAU/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>https://www.facebook.com/AnimaleMaleEnhancementPills/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/</strong></a></p>
<p> </p>
<h3><span style="background-color: #ffff00;"><strong>Animale Male Enhancement Uruguay Official Links ⇒</strong></span></h3>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInUY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInUY/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguay/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguay/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUYUruguay/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUYUruguay/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguayUY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguayUY/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguayPrice/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguayPrice/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Uruguay/100090983842331/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-Uruguay/100090983842331/</strong></a></p>
<p> </p>
<h3><span style="background-color: #ffff00;"><strong>Animale Male Enhancement South Africa Official Links ⇒</strong></span></h3>
<p><a href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>https://www.facebook.com/AnimaleCBDGummiesZA/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/"><strong>https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/</strong></a></p>
<p><a href="https://www.facebook.com/events/1121615602562904/"><strong>https://www.facebook.com/events/1121615602562904/</strong></a></p>
<p><a href="https://www.facebook.com/events/1295846104688434/"><strong>https://www.facebook.com/events/1295846104688434/</strong></a></p>
<p><a href="https://www.facebook.com/events/1429727191099071/"><strong>https://www.facebook.com/events/1429727191099071/</strong></a></p>
<p> </p>
<h3><span style="background-color: #ffff00;"><strong>Animale Male Enhancement Malaysia Official Links ⇒</strong></span></h3>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementMalaysia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementMalaysia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInMalaysia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInMalaysia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementMY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementMY/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInMY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInMY/</strong></a></p>
<p> </p>
<h3><span style="background-color: #ffff00;"><strong>Animale Male Enhancement New Zealand Official Links ⇒</strong></span></h3>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementNZ/"><strong>https://www.facebook.com/AnimaleMaleEnhancementNZ/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementOfNZ/"><strong>https://www.facebook.com/AnimaleMaleEnhancementOfNZ/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAtNZ/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAtNZ/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInNewZealand/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInNewZealand/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementOFNewZealand/"><strong>https://www.facebook.com/AnimaleMaleEnhancementOFNewZealand/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAtNewZealand/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAtNewZealand/</strong></a></p>
<p> </p>
<h3><span style="background-color: #ffff00;"><strong>Slim Life Keto Gummies Official Links ⇒</strong></span></h3>
<p><a href="https://www.facebook.com/SlimlifeKetoGummiesUS/"><strong>https://www.facebook.com/SlimlifeKetoGummiesUS/</strong></a></p>
<p><a href="https://www.facebook.com/SlimLifeKetoGummiesInUS/"><strong>https://www.facebook.com/SlimLifeKetoGummiesInUS/</strong></a></p>
<p><a href="https://www.facebook.com/SlimlifeKetoGummiesOfUS/"><strong>https://www.facebook.com/SlimlifeKetoGummiesOfUS/</strong></a></p>
<p><a href="https://www.facebook.com/SlimLifeKetoGummiesAtUS/"><strong>https://www.facebook.com/SlimLifeKetoGummiesAtUS/</strong></a></p>
<p><a href="https://www.facebook.com/SlimLifeEvolutionKetoGummiesOfUS/"><strong>https://www.facebook.com/SlimLifeEvolutionKetoGummiesOfUS/</strong></a></p>
<p><a href="https://www.facebook.com/slimLifeEvolutionKetoGummiesReview/"><strong>https://www.facebook.com/slimLifeEvolutionKetoGummiesReview/</strong></a></p>
<p> </p>
<p><strong><span style="font-size: 22px;">Búsquedas recientes :</span></strong></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVe/"><strong>#AnimaleMaleEnhancementVE</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVE/"><strong>#AnimaleMaleEnhancementVEReviews</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/"><strong>#AnimaleMaleEnhancementVEPrice</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/"><strong>#AnimaleMaleEnhancementVEBuy</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/"><strong>#AnimaleMaleEnhancementVEShop</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/"><strong>#AnimaleMaleEnhancementVEOrder</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVe/"><strong>#AnimaleMaleEnhancementVEOfficial</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVE/"><strong>#AnimaleMaleEnhancementVEIngredients</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/"><strong>#AnimaleMaleEnhancementVEPurchase</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/"><strong>#AnimaleMaleEnhancementVEDiscount</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/"><strong>#AnimaleMaleEnhancementVELegit</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/"><strong>#AnimaleMaleEnhancementVEScam</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVe/"><strong>#AnimaleMaleEnhancementVenezuela</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVE/"><strong>#AnimaleMaleEnhancementVenezuelaReviews</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/"><strong>#AnimaleMaleEnhancementVenezuelaPrice</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/"><strong>#AnimaleMaleEnhancementVenezuelaBuy</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/"><strong>#AnimaleMaleEnhancementVenezuelaShop</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/"><strong>#AnimaleMaleEnhancementVenezuelaOrder</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVe/"><strong>#AnimaleMaleEnhancementVenezuelaOfficial</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVE/"><strong>#AnimaleMaleEnhancementVenezuelaIngredients</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/"><strong>#AnimaleMaleEnhancementVenezuelaPurchase</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/"><strong>#AnimaleMaleEnhancementVenezuelaDiscount</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/"><strong>#AnimaleMaleEnhancementVenezuelaLegit</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/"><strong>#AnimaleMaleEnhancementVenezuelaScam</strong></a></p> |
AnimaleMejoraMasculinaVenezuela/AnimaleMaleEnhancementVE | 2023-09-11T13:08:57.000Z | [
"region:us"
] | AnimaleMejoraMasculinaVenezuela | null | null | null | 0 | 0 | <h2><strong>Official Facebook page:</strong></h2>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVe/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVe/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVE/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInVE/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/</strong></a></p>
<p> </p>
<h3>✅ Product name: <a href="https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt"><strong><span style="color: #b8312f;">Animale Male Enhancement Venezuela</span></strong></a></h3>
<h3>✅ Benefits: <a href="https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt"><strong><span style="color: #b8312f;">Stronger sex drive</span></strong></a></h3>
<h3>✅ Quantity: <a href="https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt"><strong><span style="color: #b8312f;">30 pills</span></strong></a></h3>
<h3>✅ Rating: <strong>★★★★☆ (4.5/5.0)</strong></h3>
<h3>✅ Offer: <a href="https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt"><strong><span style="color: #b8312f;">Unconditional 90-day guarantee</span></strong></a></h3>
<p><a href="https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt"><strong><span style="font-size: 18px;">Animale Male Enhancement Venezuela</span></strong></a><span style="font-size: 18px;"><strong>,</strong> the most potent male enhancement pills on the market right now</span></p>
<p>As a way to boost their confidence and have more satisfying sexual encounters, men make improving their sex lives a priority. However, due to a lack of knowledge and experience, many men turn to strategies that are ineffective and can even be harmful to their long-term health. That being the case, we would like to tell you about Animale Male Enhancement Venezuela, the most potent male enhancement pills on the market right now. These gummies are made from all-natural ingredients and are safe for anyone interested in improving their sexual performance.</p>
<p><a href="https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt"><img src="https://i.ibb.co/SsB5pxY/Animale-Male-Enhancement-Venezuela-Capsules.png" alt="Animale-Male-Enhancement-Venezuela-Capsules" border="0" /></a></p>
<p><a href="https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt"><strong><span style="color: #ff0000; font-size: 26px; text-shadow: rgba(255, 0, 0, 0.8) 3px 3px 20px;">DEBE VER: HAGA CLIC AQUÍ PARA COMPRAR Y OBTENER 70% DE DESCUENTO EN VENEZUELA</span></strong></a></p>
<p><a href="https://healthcare24hrs.com/animale-male-enhancement-venezuela-jt"><strong><span style="color: #ff0000; font-size: 26px; text-shadow: rgba(255, 0, 0, 0.8) 3px 3px 20px;">DEBE VER: HAGA CLIC AQUÍ PARA COMPRAR Y OBTENER 70% DE DESCUENTO EN VENEZUELA</span></strong></a></p>
|
osbm/emoji | 2023-09-11T13:15:46.000Z | [
"region:us"
] | osbm | null | null | null | 0 | 0 | Entry not found |
ceskayaka/surrender | 2023-09-11T14:11:01.000Z | [
"license:other",
"region:us"
] | ceskayaka | null | null | null | 0 | 0 | ---
license: other
---
|
bongo2112/facechain-models-full | 2023-09-11T13:53:07.000Z | [
"region:us"
] | bongo2112 | null | null | null | 0 | 0 | Entry not found |
PVIT/pvit_data_stage2 | 2023-09-19T03:55:19.000Z | [
"license:cc-by-nc-4.0",
"arxiv:2308.13437",
"region:us"
] | PVIT | null | null | null | 1 | 0 | ---
license: cc-by-nc-4.0
---
# PVIT dataset
This is the stage 2 pretraining dataset of paper: [Position-Enhanced Visual Instruction Tuning for Multimodal Large Language Models](https://arxiv.org/abs/2308.13437).
## Model description
Position-enhanced Visual Instruction Tuning (PVIT) extends the MLLM by incorporating an additional region-level vision encoder to support region-based inputs. Specifically, we adopt the vision encoder from RegionCLIP and use it to extract region-level features, taking images and regions as inputs. Incorporating region-level features in this way, as an additional source of information, has minimal impact on the original MLLM. Furthermore, since the features provided by RegionCLIP are themselves already aligned to language at a fine-grained level, the overhead of aligning them to the MLLM is relatively small. Following [LLaVA](https://github.com/haotian-liu/LLaVA), we design a two-stage training strategy for PVIT that first pre-trains a linear projection to align the region features to the LLM word embedding, followed by end-to-end fine-tuning to follow complex fine-grained instructions.
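As a rough illustration of the stage-1 alignment step, the sketch below projects region features into the LLM embedding space (a minimal PyTorch sketch; the feature dimensions and variable names are assumptions, not values taken from the paper):

```python
import torch
import torch.nn as nn

# Illustrative dimensions only -- the real RegionCLIP / LLM sizes may differ.
REGION_DIM = 1024  # assumed RegionCLIP region-feature size
LLM_DIM = 4096     # assumed LLM hidden size

# Stage 1 trains only this projection, keeping the encoders and LLM frozen.
region_projector = nn.Linear(REGION_DIM, LLM_DIM)

region_features = torch.randn(8, REGION_DIM)       # features for 8 regions
region_tokens = region_projector(region_features)  # shape: (8, LLM_DIM)
# The projected vectors can then be spliced into the LLM input sequence
# alongside ordinary text-token embeddings.
```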
For more details, please refer to our [paper](https://arxiv.org/abs/2308.13437) and [github repo](https://github.com/THUNLP-MT/PVIT).
## How to use
See [here](https://github.com/THUNLP-MT/PVIT#Train) for pretraining instructions.
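To simply browse the data, a minimal sketch with the `datasets` library might look like the following (illustrative only; depending on how the files in this repo are laid out, you may need to pass explicit `data_files`):

```python
from datasets import load_dataset

# Pull the stage-2 pretraining data from the Hugging Face Hub.
# If the repo is not in a standard layout, point load_dataset at the
# specific files instead, e.g. load_dataset("json", data_files={...}).
dataset = load_dataset("PVIT/pvit_data_stage2")
print(dataset)
```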
## Intended use
Primary intended uses: The primary use of PVIT is research on large multimodal models and chatbots.
Primary intended users: The primary intended users of the model are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence.
## BibTeX entry and citation info
```bibtex
@misc{chen2023positionenhanced,
title={Position-Enhanced Visual Instruction Tuning for Multimodal Large Language Models},
author={Chi Chen and Ruoyu Qin and Fuwen Luo and Xiaoyue Mi and Peng Li and Maosong Sun and Yang Liu},
year={2023},
eprint={2308.13437},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
``` |
davanstrien/model-sizer-bot-stats | 2023-09-11T14:01:59.000Z | [
"region:us"
] | davanstrien | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: createdAt
dtype: timestamp[us]
- name: pr_number
dtype: int64
- name: status
dtype: large_string
- name: repo_id
dtype: large_string
- name: type
dtype: large_string
- name: isPullRequest
dtype: bool
splits:
- name: train
num_bytes: 3465
num_examples: 44
download_size: 0
dataset_size: 3465
---
# Dataset Card for "model-sizer-bot-stats"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/librarian-bot-stats | 2023-09-11T16:49:25.000Z | [
"region:us"
] | davanstrien | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: createdAt
dtype: timestamp[us]
- name: pr_number
dtype: int64
- name: status
dtype: large_string
- name: repo_id
dtype: large_string
- name: type
dtype: large_string
- name: isPullRequest
dtype: bool
splits:
- name: train
num_bytes: 297708
num_examples: 3416
download_size: 123005
dataset_size: 297708
---
# Dataset Card for "librarian-bot-stats"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zhaoyang9425/NoisyLibriSpeechDataset-MUSAN | 2023-09-14T12:29:19.000Z | [
"language:en",
"license:afl-3.0",
"read book",
"region:us"
] | zhaoyang9425 | null | null | null | 0 | 0 | ---
license: afl-3.0
task_categories:
- noisy_speech_recognition
language:
- en
tags:
- read book
pretty_name: NoisyLibriSpeech_MUSAN
---
# Dataset Card for the Noisy LibriSpeech dataset
## Dataset Description
- **Homepage:** Coming Soon
- **Repository:** https://huggingface.co/datasets/zhaoyang9425/NoisyLibriSpeechDataset-MUSAN
- **Paper:** Coming Soon
- **Point of Contact:** zhaoyang9425@gmail.com
### Dataset Summary
The noisy speech corpus is constructed by randomly sampling noise clips from the MUSAN noise dataset and adding them to the LibriSpeech dataset.
The Signal-to-Noise Ratio (SNR) level of each mixture is sampled uniformly at random from 0 dB, 5 dB, 10 dB, 15 dB, and 20 dB.
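As an illustration of this recipe, the sketch below mixes a noise clip into a clean utterance at a target SNR (a minimal NumPy sketch of the general procedure, not the exact script used to build this corpus; the function name and the noise looping/trimming details are assumptions):

```python
import numpy as np

def mix_at_snr(speech: np.ndarray, noise: np.ndarray, snr_db: float) -> np.ndarray:
    """Add `noise` to `speech` so the mixture has the requested SNR in dB."""
    # Loop the noise if it is shorter than the speech, then trim to length.
    if len(noise) < len(speech):
        noise = np.tile(noise, int(np.ceil(len(speech) / len(noise))))
    noise = noise[: len(speech)]

    speech_power = np.mean(speech ** 2)
    noise_power = np.mean(noise ** 2)
    # Choose a gain so that speech_power / (gain**2 * noise_power) == 10**(snr_db/10).
    gain = np.sqrt(speech_power / (noise_power * 10 ** (snr_db / 10)))
    return speech + gain * noise

# Draw an SNR level uniformly from the set used by this corpus.
rng = np.random.default_rng(0)
snr = rng.choice([0, 5, 10, 15, 20])
# noisy = mix_at_snr(clean_speech, musan_noise, snr)  # arrays loaded elsewhere
```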
## Dataset Structure
Same structure as the LibriSpeech dataset.
|
Pielgrin/omajis | 2023-09-13T15:06:15.000Z | [
"region:us"
] | Pielgrin | null | null | null | 0 | 0 | Entry not found |
p1atdev/JEDHRI | 2023-09-11T15:05:47.000Z | [
"size_categories:n<1K",
"language:ja",
"license:cc-by-4.0",
"legal",
"not-for-all-audiences",
"region:us"
] | p1atdev | Japanese Expressions Dataset from Human Rights Infringement on Internet | @dataset{hisada_shohei_2023_7960519,
author = {HISADA, Shohei},
title = {{Japanese Expressions Dataset from Human Rights
Infringement on Internet}},
month = jun,
year = 2023,
publisher = {Zenodo},
version = {0.2},
doi = {10.5281/zenodo.7960519},
url = {https://doi.org/10.5281/zenodo.7960519}
} | null | 0 | 0 | ---
license: cc-by-4.0
language:
- ja
tags:
- legal
- not-for-all-audiences
size_categories:
- n<1K
---
### Japanese Expressions Dataset from Human Rights Infringement on Internet
Adapted for HuggingFace datasets from [権利侵害と不快さの間:日本語人権侵害表現データセット](https://zenodo.org/record/7960519) (the "Japanese Expressions Dataset from Human Rights Infringement on Internet"). |
dot-ammar/AR-dotless-medium | 2023-09-11T15:37:58.000Z | [
"region:us"
] | dot-ammar | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: clean
dtype: string
- name: dotless
dtype: string
splits:
- name: train
num_bytes: 400580815.31144565
num_examples: 2274050
download_size: 228315577
dataset_size: 400580815.31144565
---
# Dataset Card for "AR-dotless-medium"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
radioworkflow/audio | 2023-09-11T15:18:32.000Z | [
"region:us"
] | radioworkflow | null | null | null | 0 | 0 | Entry not found |
namngo/dsadd | 2023-09-11T15:17:46.000Z | [
"region:us"
] | namngo | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_KnutJaegersberg__megatron-gpt2-345m-evol_instruct_v2 | 2023-09-11T15:26:40.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of KnutJaegersberg/megatron-gpt2-345m-evol_instruct_v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/megatron-gpt2-345m-evol_instruct_v2](https://huggingface.co/KnutJaegersberg/megatron-gpt2-345m-evol_instruct_v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__megatron-gpt2-345m-evol_instruct_v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T15:25:29.306947](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__megatron-gpt2-345m-evol_instruct_v2/blob/main/results_2023-09-11T15-25-29.306947.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23745408525824543,\n\
\ \"acc_stderr\": 0.030774008630701014,\n \"acc_norm\": 0.23896085913917708,\n\
\ \"acc_norm_stderr\": 0.030786738280539294,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570345,\n \"mc2\": 0.4118877420447148,\n\
\ \"mc2_stderr\": 0.014815588726198047\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2295221843003413,\n \"acc_stderr\": 0.012288926760890792,\n\
\ \"acc_norm\": 0.2636518771331058,\n \"acc_norm_stderr\": 0.012875929151297061\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3291177056363274,\n\
\ \"acc_stderr\": 0.004689324696186867,\n \"acc_norm\": 0.38388767177853017,\n\
\ \"acc_norm_stderr\": 0.00485337164623925\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.02619980880756193,\n\
\ \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.02619980880756193\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.036845294917747115,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.036845294917747115\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.030631145539198813,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.030631145539198813\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.14705882352941177,\n \"acc_stderr\": 0.035240689515674474,\n\
\ \"acc_norm\": 0.14705882352941177,\n \"acc_norm_stderr\": 0.035240689515674474\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.02865917937429232,\n\
\ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.02865917937429232\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843669,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843669\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24338624338624337,\n \"acc_stderr\": 0.02210112878741543,\n \"\
acc_norm\": 0.24338624338624337,\n \"acc_norm_stderr\": 0.02210112878741543\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2161290322580645,\n\
\ \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.2161290322580645,\n\
\ \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03144712581678242,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03144712581678242\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20707070707070707,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860667,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860667\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2641025641025641,\n \"acc_stderr\": 0.02235219373745327,\n \
\ \"acc_norm\": 0.2641025641025641,\n \"acc_norm_stderr\": 0.02235219373745327\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145692,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145692\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.18487394957983194,\n \"acc_stderr\": 0.025215992877954202,\n\
\ \"acc_norm\": 0.18487394957983194,\n \"acc_norm_stderr\": 0.025215992877954202\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.19205298013245034,\n \"acc_stderr\": 0.03216298420593614,\n \"\
acc_norm\": 0.19205298013245034,\n \"acc_norm_stderr\": 0.03216298420593614\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22568807339449543,\n \"acc_stderr\": 0.017923087667803046,\n \"\
acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.017923087667803046\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.17592592592592593,\n \"acc_stderr\": 0.025967420958258533,\n \"\
acc_norm\": 0.17592592592592593,\n \"acc_norm_stderr\": 0.025967420958258533\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n\
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.273542600896861,\n\
\ \"acc_stderr\": 0.029918586707798827,\n \"acc_norm\": 0.273542600896861,\n\
\ \"acc_norm_stderr\": 0.029918586707798827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.27350427350427353,\n\
\ \"acc_stderr\": 0.029202540153431177,\n \"acc_norm\": 0.27350427350427353,\n\
\ \"acc_norm_stderr\": 0.029202540153431177\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24265644955300128,\n\
\ \"acc_stderr\": 0.015329888940899865,\n \"acc_norm\": 0.24265644955300128,\n\
\ \"acc_norm_stderr\": 0.015329888940899865\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.02298959254312357,\n\
\ \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.02298959254312357\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.014219570788103982,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.014219570788103982\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.20257234726688103,\n\
\ \"acc_stderr\": 0.022827317491059675,\n \"acc_norm\": 0.20257234726688103,\n\
\ \"acc_norm_stderr\": 0.022827317491059675\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.023788583551658537,\n\
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.023788583551658537\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2375886524822695,\n \"acc_stderr\": 0.025389512552729903,\n \
\ \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.025389512552729903\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n\
\ \"acc_stderr\": 0.011015752255279336,\n \"acc_norm\": 0.2470664928292047,\n\
\ \"acc_norm_stderr\": 0.011015752255279336\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3382352941176471,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.3382352941176471,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2434640522875817,\n \"acc_stderr\": 0.017362473762146634,\n \
\ \"acc_norm\": 0.2434640522875817,\n \"acc_norm_stderr\": 0.017362473762146634\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.20816326530612245,\n \"acc_stderr\": 0.02599111767281329,\n\
\ \"acc_norm\": 0.20816326530612245,\n \"acc_norm_stderr\": 0.02599111767281329\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30409356725146197,\n \"acc_stderr\": 0.03528211258245232,\n\
\ \"acc_norm\": 0.30409356725146197,\n \"acc_norm_stderr\": 0.03528211258245232\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570345,\n \"mc2\": 0.4118877420447148,\n\
\ \"mc2_stderr\": 0.014815588726198047\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/megatron-gpt2-345m-evol_instruct_v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|arc:challenge|25_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hellaswag|10_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-25-29.306947.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-25-29.306947.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T15-25-29.306947.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T15-25-29.306947.parquet'
- config_name: results
data_files:
- split: 2023_09_11T15_25_29.306947
path:
- results_2023-09-11T15-25-29.306947.parquet
- split: latest
path:
- results_2023-09-11T15-25-29.306947.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/megatron-gpt2-345m-evol_instruct_v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/megatron-gpt2-345m-evol_instruct_v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/megatron-gpt2-345m-evol_instruct_v2](https://huggingface.co/KnutJaegersberg/megatron-gpt2-345m-evol_instruct_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__megatron-gpt2-345m-evol_instruct_v2",
"harness_truthfulqa_mc_0",
	split="latest")
```
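
For example, to pull the per-sample predictions of a single MMLU subtask, you can point at one of the configs listed in this card's metadata (a minimal sketch; the config and split names below are taken from the `configs` section above):

```python
from datasets import load_dataset

# Per-sample details for one MMLU subtask; the "latest" split always
# resolves to the most recent evaluation run recorded in this card.
details = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__megatron-gpt2-345m-evol_instruct_v2",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(details)
```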
## Latest results
These are the [latest results from run 2023-09-11T15:25:29.306947](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__megatron-gpt2-345m-evol_instruct_v2/blob/main/results_2023-09-11T15-25-29.306947.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks. You can find each in the "results" config and in the "latest" split for each eval):
```json
{
"all": {
"acc": 0.23745408525824543,
"acc_stderr": 0.030774008630701014,
"acc_norm": 0.23896085913917708,
"acc_norm_stderr": 0.030786738280539294,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570345,
"mc2": 0.4118877420447148,
"mc2_stderr": 0.014815588726198047
},
"harness|arc:challenge|25": {
"acc": 0.2295221843003413,
"acc_stderr": 0.012288926760890792,
"acc_norm": 0.2636518771331058,
"acc_norm_stderr": 0.012875929151297061
},
"harness|hellaswag|10": {
"acc": 0.3291177056363274,
"acc_stderr": 0.004689324696186867,
"acc_norm": 0.38388767177853017,
"acc_norm_stderr": 0.00485337164623925
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23773584905660378,
"acc_stderr": 0.02619980880756193,
"acc_norm": 0.23773584905660378,
"acc_norm_stderr": 0.02619980880756193
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.16,
"acc_stderr": 0.036845294917747115,
"acc_norm": 0.16,
"acc_norm_stderr": 0.036845294917747115
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198813,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198813
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.14705882352941177,
"acc_stderr": 0.035240689515674474,
"acc_norm": 0.14705882352941177,
"acc_norm_stderr": 0.035240689515674474
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.02865917937429232,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.02865917937429232
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843669,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843669
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24338624338624337,
"acc_stderr": 0.02210112878741543,
"acc_norm": 0.24338624338624337,
"acc_norm_stderr": 0.02210112878741543
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2161290322580645,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.2161290322580645,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.03144712581678242,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.03144712581678242
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20707070707070707,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.20707070707070707,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860667,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860667
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2641025641025641,
"acc_stderr": 0.02235219373745327,
"acc_norm": 0.2641025641025641,
"acc_norm_stderr": 0.02235219373745327
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145692,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145692
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.18487394957983194,
"acc_stderr": 0.025215992877954202,
"acc_norm": 0.18487394957983194,
"acc_norm_stderr": 0.025215992877954202
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.19205298013245034,
"acc_stderr": 0.03216298420593614,
"acc_norm": 0.19205298013245034,
"acc_norm_stderr": 0.03216298420593614
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22568807339449543,
"acc_stderr": 0.017923087667803046,
"acc_norm": 0.22568807339449543,
"acc_norm_stderr": 0.017923087667803046
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.17592592592592593,
"acc_stderr": 0.025967420958258533,
"acc_norm": 0.17592592592592593,
"acc_norm_stderr": 0.025967420958258533
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.273542600896861,
"acc_stderr": 0.029918586707798827,
"acc_norm": 0.273542600896861,
"acc_norm_stderr": 0.029918586707798827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.27350427350427353,
"acc_stderr": 0.029202540153431177,
"acc_norm": 0.27350427350427353,
"acc_norm_stderr": 0.029202540153431177
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24265644955300128,
"acc_stderr": 0.015329888940899865,
"acc_norm": 0.24265644955300128,
"acc_norm_stderr": 0.015329888940899865
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.02298959254312357,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.02298959254312357
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.014219570788103982,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.014219570788103982
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.20257234726688103,
"acc_stderr": 0.022827317491059675,
"acc_norm": 0.20257234726688103,
"acc_norm_stderr": 0.022827317491059675
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.023788583551658537,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.023788583551658537
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2375886524822695,
"acc_stderr": 0.025389512552729903,
"acc_norm": 0.2375886524822695,
"acc_norm_stderr": 0.025389512552729903
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2470664928292047,
"acc_stderr": 0.011015752255279336,
"acc_norm": 0.2470664928292047,
"acc_norm_stderr": 0.011015752255279336
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3382352941176471,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.3382352941176471,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2434640522875817,
"acc_stderr": 0.017362473762146634,
"acc_norm": 0.2434640522875817,
"acc_norm_stderr": 0.017362473762146634
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20816326530612245,
"acc_stderr": 0.02599111767281329,
"acc_norm": 0.20816326530612245,
"acc_norm_stderr": 0.02599111767281329
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30409356725146197,
"acc_stderr": 0.03528211258245232,
"acc_norm": 0.30409356725146197,
"acc_norm_stderr": 0.03528211258245232
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570345,
"mc2": 0.4118877420447148,
"mc2_stderr": 0.014815588726198047
}
}
```
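
To work with these metrics programmatically rather than reading the JSON above, one option is to load the aggregated `results` config (a sketch assuming the `results` config and `latest` split defined in this card's metadata):

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of the run; its
# "latest" split points at results_2023-09-11T15-25-29.306947.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__megatron-gpt2-345m-evol_instruct_v2",
    "results",
    split="latest",
)
print(results)
```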
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-pretrain | 2023-09-11T15:30:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of YeungNLP/firefly-llama2-7b-pretrain
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [YeungNLP/firefly-llama2-7b-pretrain](https://huggingface.co/YeungNLP/firefly-llama2-7b-pretrain)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-pretrain\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T15:29:37.507273](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-pretrain/blob/main/results_2023-09-11T15-29-37.507273.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.413730135636818,\n\
\ \"acc_stderr\": 0.03536839698284917,\n \"acc_norm\": 0.4174258273603177,\n\
\ \"acc_norm_stderr\": 0.035358486112379406,\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326916,\n \"mc2\": 0.39076760896589485,\n\
\ \"mc2_stderr\": 0.014445824494340054\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4598976109215017,\n \"acc_stderr\": 0.014564318856924848,\n\
\ \"acc_norm\": 0.4863481228668942,\n \"acc_norm_stderr\": 0.014605943429860947\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5566620195180243,\n\
\ \"acc_stderr\": 0.0049576376484264705,\n \"acc_norm\": 0.7482573192591118,\n\
\ \"acc_norm_stderr\": 0.0043312717177738545\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3925925925925926,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.3925925925925926,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.040179012759817494,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.040179012759817494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.47547169811320755,\n \"acc_stderr\": 0.030735822206205615,\n\
\ \"acc_norm\": 0.47547169811320755,\n \"acc_norm_stderr\": 0.030735822206205615\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04155319955593146\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.4393063583815029,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.03097669299853442,\n\
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.03097669299853442\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.04113914981189261,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.04113914981189261\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.0393253768039287,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.0393253768039287\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.43870967741935485,\n\
\ \"acc_stderr\": 0.028229497320317213,\n \"acc_norm\": 0.43870967741935485,\n\
\ \"acc_norm_stderr\": 0.028229497320317213\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.41818181818181815,\n \"acc_stderr\": 0.03851716319398393,\n\
\ \"acc_norm\": 0.41818181818181815,\n \"acc_norm_stderr\": 0.03851716319398393\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.46464646464646464,\n \"acc_stderr\": 0.035534363688280626,\n \"\
acc_norm\": 0.46464646464646464,\n \"acc_norm_stderr\": 0.035534363688280626\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5440414507772021,\n \"acc_stderr\": 0.03594413711272437,\n\
\ \"acc_norm\": 0.5440414507772021,\n \"acc_norm_stderr\": 0.03594413711272437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3974358974358974,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.3974358974358974,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230182,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230182\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.031204691225150013,\n\
\ \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.031204691225150013\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5541284403669725,\n \"acc_stderr\": 0.021311335009708575,\n \"\
acc_norm\": 0.5541284403669725,\n \"acc_norm_stderr\": 0.021311335009708575\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3101851851851852,\n \"acc_stderr\": 0.03154696285656628,\n \"\
acc_norm\": 0.3101851851851852,\n \"acc_norm_stderr\": 0.03154696285656628\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.47058823529411764,\n \"acc_stderr\": 0.03503235296367992,\n \"\
acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03503235296367992\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4430379746835443,\n \"acc_stderr\": 0.03233532777533484,\n \
\ \"acc_norm\": 0.4430379746835443,\n \"acc_norm_stderr\": 0.03233532777533484\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4125560538116592,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.4125560538116592,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5038167938931297,\n \"acc_stderr\": 0.04385162325601553,\n\
\ \"acc_norm\": 0.5038167938931297,\n \"acc_norm_stderr\": 0.04385162325601553\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5454545454545454,\n \"acc_stderr\": 0.04545454545454548,\n \"\
acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.04545454545454548\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.04750077341199984,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.04750077341199984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4171779141104294,\n \"acc_stderr\": 0.038741028598180814,\n\
\ \"acc_norm\": 0.4171779141104294,\n \"acc_norm_stderr\": 0.038741028598180814\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.47572815533980584,\n \"acc_stderr\": 0.049449010929737795,\n\
\ \"acc_norm\": 0.47572815533980584,\n \"acc_norm_stderr\": 0.049449010929737795\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6025641025641025,\n\
\ \"acc_stderr\": 0.03205953453789293,\n \"acc_norm\": 0.6025641025641025,\n\
\ \"acc_norm_stderr\": 0.03205953453789293\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.5402298850574713,\n \"acc_stderr\": 0.01782199409693354,\n\
\ \"acc_norm\": 0.5402298850574713,\n \"acc_norm_stderr\": 0.01782199409693354\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3930635838150289,\n\
\ \"acc_stderr\": 0.026296227915613677,\n \"acc_norm\": 0.3930635838150289,\n\
\ \"acc_norm_stderr\": 0.026296227915613677\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.2737430167597765,\n \"acc_stderr\": 0.014912413096372434,\n\
\ \"acc_norm\": 0.2737430167597765,\n \"acc_norm_stderr\": 0.014912413096372434\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4803921568627451,\n\
\ \"acc_stderr\": 0.028607893699576063,\n \"acc_norm\": 0.4803921568627451,\n\
\ \"acc_norm_stderr\": 0.028607893699576063\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.4919614147909968,\n \"acc_stderr\": 0.028394421370984545,\n\
\ \"acc_norm\": 0.4919614147909968,\n \"acc_norm_stderr\": 0.028394421370984545\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.3950617283950617,\n\
\ \"acc_stderr\": 0.027201117666925657,\n \"acc_norm\": 0.3950617283950617,\n\
\ \"acc_norm_stderr\": 0.027201117666925657\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.3262411347517731,\n \"acc_stderr\": 0.02796845304356317,\n\
\ \"acc_norm\": 0.3262411347517731,\n \"acc_norm_stderr\": 0.02796845304356317\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3057366362451108,\n\
\ \"acc_stderr\": 0.011766973847072912,\n \"acc_norm\": 0.3057366362451108,\n\
\ \"acc_norm_stderr\": 0.011766973847072912\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.03018753206032938,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.03018753206032938\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.40032679738562094,\n \"acc_stderr\": 0.01982184368827177,\n \
\ \"acc_norm\": 0.40032679738562094,\n \"acc_norm_stderr\": 0.01982184368827177\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4909090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.4909090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3551020408163265,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.3551020408163265,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5472636815920398,\n\
\ \"acc_stderr\": 0.035197027175769155,\n \"acc_norm\": 0.5472636815920398,\n\
\ \"acc_norm_stderr\": 0.035197027175769155\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n\
\ \"acc_stderr\": 0.037400593820293204,\n \"acc_norm\": 0.3614457831325301,\n\
\ \"acc_norm_stderr\": 0.037400593820293204\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5730994152046783,\n \"acc_stderr\": 0.03793620616529916,\n\
\ \"acc_norm\": 0.5730994152046783,\n \"acc_norm_stderr\": 0.03793620616529916\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326916,\n \"mc2\": 0.39076760896589485,\n\
\ \"mc2_stderr\": 0.014445824494340054\n }\n}\n```"
repo_url: https://huggingface.co/YeungNLP/firefly-llama2-7b-pretrain
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|arc:challenge|25_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hellaswag|10_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T15-29-37.507273.parquet'
- config_name: results
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- results_2023-09-11T15-29-37.507273.parquet
- split: latest
path:
- results_2023-09-11T15-29-37.507273.parquet
---
# Dataset Card for Evaluation run of YeungNLP/firefly-llama2-7b-pretrain
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/YeungNLP/firefly-llama2-7b-pretrain
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [YeungNLP/firefly-llama2-7b-pretrain](https://huggingface.co/YeungNLP/firefly-llama2-7b-pretrain) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-pretrain",
"harness_truthfulqa_mc_0",
split="train")
```
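The aggregated metrics of the run live in the `results` configuration. As a minimal sketch (assuming the `results` config and `latest` split mapping shown in the YAML header above), they can be loaded the same way:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics of the run;
# the "latest" split always resolves to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-pretrain",
    "results",
    split="latest",
)
print(results)
```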
## Latest results
These are the [latest results from run 2023-09-11T15:29:37.507273](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-pretrain/blob/main/results_2023-09-11T15-29-37.507273.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the timestamped splits and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.413730135636818,
"acc_stderr": 0.03536839698284917,
"acc_norm": 0.4174258273603177,
"acc_norm_stderr": 0.035358486112379406,
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326916,
"mc2": 0.39076760896589485,
"mc2_stderr": 0.014445824494340054
},
"harness|arc:challenge|25": {
"acc": 0.4598976109215017,
"acc_stderr": 0.014564318856924848,
"acc_norm": 0.4863481228668942,
"acc_norm_stderr": 0.014605943429860947
},
"harness|hellaswag|10": {
"acc": 0.5566620195180243,
"acc_stderr": 0.0049576376484264705,
"acc_norm": 0.7482573192591118,
"acc_norm_stderr": 0.0043312717177738545
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.47547169811320755,
"acc_stderr": 0.030735822206205615,
"acc_norm": 0.47547169811320755,
"acc_norm_stderr": 0.030735822206205615
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04155319955593146,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04155319955593146
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.03097669299853442,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.03097669299853442
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.04113914981189261,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.04113914981189261
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.0393253768039287,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.0393253768039287
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.43870967741935485,
"acc_stderr": 0.028229497320317213,
"acc_norm": 0.43870967741935485,
"acc_norm_stderr": 0.028229497320317213
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.03851716319398393,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.03851716319398393
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.46464646464646464,
"acc_stderr": 0.035534363688280626,
"acc_norm": 0.46464646464646464,
"acc_norm_stderr": 0.035534363688280626
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5440414507772021,
"acc_stderr": 0.03594413711272437,
"acc_norm": 0.5440414507772021,
"acc_norm_stderr": 0.03594413711272437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3974358974358974,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.3974358974358974,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230182,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230182
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36134453781512604,
"acc_stderr": 0.031204691225150013,
"acc_norm": 0.36134453781512604,
"acc_norm_stderr": 0.031204691225150013
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5541284403669725,
"acc_stderr": 0.021311335009708575,
"acc_norm": 0.5541284403669725,
"acc_norm_stderr": 0.021311335009708575
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3101851851851852,
"acc_stderr": 0.03154696285656628,
"acc_norm": 0.3101851851851852,
"acc_norm_stderr": 0.03154696285656628
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03503235296367992,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03503235296367992
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4430379746835443,
"acc_stderr": 0.03233532777533484,
"acc_norm": 0.4430379746835443,
"acc_norm_stderr": 0.03233532777533484
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4125560538116592,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.4125560538116592,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5038167938931297,
"acc_stderr": 0.04385162325601553,
"acc_norm": 0.5038167938931297,
"acc_norm_stderr": 0.04385162325601553
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04545454545454548,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04545454545454548
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04750077341199984,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04750077341199984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4171779141104294,
"acc_stderr": 0.038741028598180814,
"acc_norm": 0.4171779141104294,
"acc_norm_stderr": 0.038741028598180814
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.0432704093257873,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.0432704093257873
},
"harness|hendrycksTest-management|5": {
"acc": 0.47572815533980584,
"acc_stderr": 0.049449010929737795,
"acc_norm": 0.47572815533980584,
"acc_norm_stderr": 0.049449010929737795
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.03205953453789293,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.03205953453789293
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5402298850574713,
"acc_stderr": 0.01782199409693354,
"acc_norm": 0.5402298850574713,
"acc_norm_stderr": 0.01782199409693354
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3930635838150289,
"acc_stderr": 0.026296227915613677,
"acc_norm": 0.3930635838150289,
"acc_norm_stderr": 0.026296227915613677
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372434,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372434
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.028607893699576063,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.028607893699576063
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4919614147909968,
"acc_stderr": 0.028394421370984545,
"acc_norm": 0.4919614147909968,
"acc_norm_stderr": 0.028394421370984545
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3950617283950617,
"acc_stderr": 0.027201117666925657,
"acc_norm": 0.3950617283950617,
"acc_norm_stderr": 0.027201117666925657
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3262411347517731,
"acc_stderr": 0.02796845304356317,
"acc_norm": 0.3262411347517731,
"acc_norm_stderr": 0.02796845304356317
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3057366362451108,
"acc_stderr": 0.011766973847072912,
"acc_norm": 0.3057366362451108,
"acc_norm_stderr": 0.011766973847072912
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.03018753206032938,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.03018753206032938
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.40032679738562094,
"acc_stderr": 0.01982184368827177,
"acc_norm": 0.40032679738562094,
"acc_norm_stderr": 0.01982184368827177
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3551020408163265,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.3551020408163265,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5472636815920398,
"acc_stderr": 0.035197027175769155,
"acc_norm": 0.5472636815920398,
"acc_norm_stderr": 0.035197027175769155
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3614457831325301,
"acc_stderr": 0.037400593820293204,
"acc_norm": 0.3614457831325301,
"acc_norm_stderr": 0.037400593820293204
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5730994152046783,
"acc_stderr": 0.03793620616529916,
"acc_norm": 0.5730994152046783,
"acc_norm_stderr": 0.03793620616529916
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326916,
"mc2": 0.39076760896589485,
"mc2_stderr": 0.014445824494340054
}
}
```
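If you prefer the raw JSON file over the parquet-backed `results` configuration, a minimal sketch using `huggingface_hub` follows; note that the top-level layout of the file is assumed here to match the dictionary shown above:
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results file referenced above and read one metric.
# Assumption: the JSON's top-level layout matches the dictionary printed above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-pretrain",
    filename="results_2023-09-11T15-29-37.507273.json",
    repo_type="dataset",
)
with open(path) as fp:
    results = json.load(fp)

print(results["all"]["acc"])  # overall accuracy reported for this run
```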
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Shivansh2310/trinity-dolly-10k | 2023-09-11T15:36:57.000Z | [
"region:us"
] | Shivansh2310 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 16392818
num_examples: 10000
download_size: 10078470
dataset_size: 16392818
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "trinity-dolly-10k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/line_art_drawing_prompts | 2023-09-11T15:43:10.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 1552162
num_examples: 10000
download_size: 216025
dataset_size: 1552162
---
# Dataset Card for "line_art_drawing_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-13B | 2023-09-18T15:06:37.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of AIDC-ai-business/Marcoroni-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AIDC-ai-business/Marcoroni-13B](https://huggingface.co/AIDC-ai-business/Marcoroni-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T15:05:14.072037](https://huggingface.co/datasets/open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-13B/blob/main/results_2023-09-18T15-05-14.072037.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the timestamped splits and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5968939242056442,\n\
\ \"acc_stderr\": 0.03397009205870784,\n \"acc_norm\": 0.6007957237246586,\n\
\ \"acc_norm_stderr\": 0.033948145854358645,\n \"mc1\": 0.4186046511627907,\n\
\ \"mc1_stderr\": 0.017270015284476855,\n \"mc2\": 0.5769635027861147,\n\
\ \"mc2_stderr\": 0.015727623906231773\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.014370358632472447,\n\
\ \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.014150631435111726\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6366261700856403,\n\
\ \"acc_stderr\": 0.004799882248494813,\n \"acc_norm\": 0.8327026488747261,\n\
\ \"acc_norm_stderr\": 0.003724783389253322\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.030197611600197946,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.030197611600197946\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36243386243386244,\n \"acc_stderr\": 0.02475747390275206,\n \"\
acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.02475747390275206\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6580645161290323,\n \"acc_stderr\": 0.026985289576552742,\n \"\
acc_norm\": 0.6580645161290323,\n \"acc_norm_stderr\": 0.026985289576552742\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\"\
: 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.035243908445117815,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.035243908445117815\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.02475600038213095,\n \
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.02475600038213095\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7908256880733945,\n \"acc_stderr\": 0.01743793717334323,\n \"\
acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.01743793717334323\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896079,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896079\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455333,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455333\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03826076324884865,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03826076324884865\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7994891443167306,\n\
\ \"acc_stderr\": 0.014317653708594207,\n \"acc_norm\": 0.7994891443167306,\n\
\ \"acc_norm_stderr\": 0.014317653708594207\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879716,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47262569832402235,\n\
\ \"acc_stderr\": 0.016697420650642752,\n \"acc_norm\": 0.47262569832402235,\n\
\ \"acc_norm_stderr\": 0.016697420650642752\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.027057974624494382,\n\
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.027057974624494382\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45241199478487615,\n\
\ \"acc_stderr\": 0.012712265105889136,\n \"acc_norm\": 0.45241199478487615,\n\
\ \"acc_norm_stderr\": 0.012712265105889136\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.029896163033125468,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.029896163033125468\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5866013071895425,\n \"acc_stderr\": 0.019922115682786685,\n \
\ \"acc_norm\": 0.5866013071895425,\n \"acc_norm_stderr\": 0.019922115682786685\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982055,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982055\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4186046511627907,\n\
\ \"mc1_stderr\": 0.017270015284476855,\n \"mc2\": 0.5769635027861147,\n\
\ \"mc2_stderr\": 0.015727623906231773\n }\n}\n```"
repo_url: https://huggingface.co/AIDC-ai-business/Marcoroni-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|arc:challenge|25_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|arc:challenge|25_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hellaswag|10_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hellaswag|10_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-45-30.030837.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T15-05-14.072037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T15-05-14.072037.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T15-45-30.030837.parquet'
- split: 2023_09_18T15_05_14.072037
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T15-05-14.072037.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T15-05-14.072037.parquet'
- config_name: results
data_files:
- split: 2023_09_11T15_45_30.030837
path:
- results_2023-09-11T15-45-30.030837.parquet
- split: 2023_09_18T15_05_14.072037
path:
- results_2023-09-18T15-05-14.072037.parquet
- split: latest
path:
- results_2023-09-18T15-05-14.072037.parquet
---
# Dataset Card for Evaluation run of AIDC-ai-business/Marcoroni-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AIDC-ai-business/Marcoroni-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AIDC-ai-business/Marcoroni-13B](https://huggingface.co/AIDC-ai-business/Marcoroni-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-13B",
"harness_truthfulqa_mc_0",
	split="latest")
```
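The aggregated scores can be loaded in the same way. A minimal sketch, assuming the config and split names listed in this card's configuration section (a "results" config with a "latest" split):
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split points to the
# most recent evaluation run (split names taken from the configs listed above).
results = load_dataset(
    "open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-13B",
    "results",
    split="latest",
)
print(results)
```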
## Latest results
These are the [latest results from run 2023-09-18T15:05:14.072037](https://huggingface.co/datasets/open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-13B/blob/main/results_2023-09-18T15-05-14.072037.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.5968939242056442,
"acc_stderr": 0.03397009205870784,
"acc_norm": 0.6007957237246586,
"acc_norm_stderr": 0.033948145854358645,
"mc1": 0.4186046511627907,
"mc1_stderr": 0.017270015284476855,
"mc2": 0.5769635027861147,
"mc2_stderr": 0.015727623906231773
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.014370358632472447,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.014150631435111726
},
"harness|hellaswag|10": {
"acc": 0.6366261700856403,
"acc_stderr": 0.004799882248494813,
"acc_norm": 0.8327026488747261,
"acc_norm_stderr": 0.003724783389253322
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.039993097127774734,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.039993097127774734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.030197611600197946,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.030197611600197946
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.02475747390275206,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.02475747390275206
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.026985289576552742,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.026985289576552742
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.035243908445117815,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.035243908445117815
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.02475600038213095,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.02475600038213095
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473072,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7908256880733945,
"acc_stderr": 0.01743793717334323,
"acc_norm": 0.7908256880733945,
"acc_norm_stderr": 0.01743793717334323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896079,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896079
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455333,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455333
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884865,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884865
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335445,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335445
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7994891443167306,
"acc_stderr": 0.014317653708594207,
"acc_norm": 0.7994891443167306,
"acc_norm_stderr": 0.014317653708594207
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879716,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47262569832402235,
"acc_stderr": 0.016697420650642752,
"acc_norm": 0.47262569832402235,
"acc_norm_stderr": 0.016697420650642752
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.027057974624494382,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.027057974624494382
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45241199478487615,
"acc_stderr": 0.012712265105889136,
"acc_norm": 0.45241199478487615,
"acc_norm_stderr": 0.012712265105889136
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.029896163033125468,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.029896163033125468
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5866013071895425,
"acc_stderr": 0.019922115682786685,
"acc_norm": 0.5866013071895425,
"acc_norm_stderr": 0.019922115682786685
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982055,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982055
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4186046511627907,
"mc1_stderr": 0.017270015284476855,
"mc2": 0.5769635027861147,
"mc2_stderr": 0.015727623906231773
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ancss/QSJ_dataset | 2023-09-13T07:39:50.000Z | [
"license:mit",
"region:us"
] | ancss | null | null | null | 0 | 0 | ---
license: mit
---
Data from the 枪神纪 (QSJ) Tieba spanning 20 years, generated by ChatGPT-3.5 and manually trimmed from 200,000 entries down to 50,000; it could probably be trimmed further to about 30,000, but I was too lazy to go through them.
ernie_dataset.jsonl is an ERNIE Bot (文心一言) dataset that includes ranking; question similarity is computed with a threshold of 0.7, and similar questions are combined. |
open-llm-leaderboard/details_lgaalves__gpt2_camel_physics-platypus | 2023-09-11T15:54:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of lgaalves/gpt2_camel_physics-platypus
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lgaalves/gpt2_camel_physics-platypus](https://huggingface.co/lgaalves/gpt2_camel_physics-platypus)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__gpt2_camel_physics-platypus\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T15:53:04.413591](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2_camel_physics-platypus/blob/main/results_2023-09-11T15-53-04.413591.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2683248305375654,\n\
\ \"acc_stderr\": 0.03193677298130021,\n \"acc_norm\": 0.2692355712851633,\n\
\ \"acc_norm_stderr\": 0.0319495253666372,\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.3955559845281961,\n\
\ \"mc2_stderr\": 0.014839540193741688\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19795221843003413,\n \"acc_stderr\": 0.011643990971573405,\n\
\ \"acc_norm\": 0.23037542662116042,\n \"acc_norm_stderr\": 0.01230492841874761\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.29187412865962953,\n\
\ \"acc_stderr\": 0.004536955796510544,\n \"acc_norm\": 0.31318462457677754,\n\
\ \"acc_norm_stderr\": 0.0046284090842187535\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\
\ \"acc_stderr\": 0.0391545063041425,\n \"acc_norm\": 0.28888888888888886,\n\
\ \"acc_norm_stderr\": 0.0391545063041425\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.02761116340239972,\n\
\ \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.02761116340239972\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.03716177437566015,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.03716177437566015\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364396,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364396\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.02865917937429232,\n\
\ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.02865917937429232\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748142,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748142\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03855289616378948,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03855289616378948\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400175,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400175\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790605,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2870967741935484,\n\
\ \"acc_stderr\": 0.025736542745594528,\n \"acc_norm\": 0.2870967741935484,\n\
\ \"acc_norm_stderr\": 0.025736542745594528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358611,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358611\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\
acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.37305699481865284,\n \"acc_stderr\": 0.03490205592048573,\n\
\ \"acc_norm\": 0.37305699481865284,\n \"acc_norm_stderr\": 0.03490205592048573\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.02403548967633507,\n\
\ \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.02403548967633507\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.026653531596715473,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.026653531596715473\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.03543304234389985,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.03543304234389985\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328,\n \"\
acc_norm\": 0.3486238532110092,\n \"acc_norm_stderr\": 0.020431254090714328\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658335,\n\
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658335\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.10762331838565023,\n\
\ \"acc_stderr\": 0.020799400082879997,\n \"acc_norm\": 0.10762331838565023,\n\
\ \"acc_norm_stderr\": 0.020799400082879997\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.17857142857142858,\n\
\ \"acc_stderr\": 0.036352091215778065,\n \"acc_norm\": 0.17857142857142858,\n\
\ \"acc_norm_stderr\": 0.036352091215778065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20306513409961685,\n\
\ \"acc_stderr\": 0.014385525076611578,\n \"acc_norm\": 0.20306513409961685,\n\
\ \"acc_norm_stderr\": 0.014385525076611578\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.21098265895953758,\n \"acc_stderr\": 0.021966309947043124,\n\
\ \"acc_norm\": 0.21098265895953758,\n \"acc_norm_stderr\": 0.021966309947043124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.01435591196476786,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.01435591196476786\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.28938906752411575,\n\
\ \"acc_stderr\": 0.025755865922632924,\n \"acc_norm\": 0.28938906752411575,\n\
\ \"acc_norm_stderr\": 0.025755865922632924\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590638,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590638\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n\
\ \"acc_stderr\": 0.010986307870045514,\n \"acc_norm\": 0.24511082138200782,\n\
\ \"acc_norm_stderr\": 0.010986307870045514\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329376,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329376\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528023,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528023\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.03895091015724137,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.03895091015724137\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21890547263681592,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.21890547263681592,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1927710843373494,\n\
\ \"acc_stderr\": 0.030709824050565274,\n \"acc_norm\": 0.1927710843373494,\n\
\ \"acc_norm_stderr\": 0.030709824050565274\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.25146198830409355,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.25146198830409355,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.3955559845281961,\n\
\ \"mc2_stderr\": 0.014839540193741688\n }\n}\n```"
repo_url: https://huggingface.co/lgaalves/gpt2_camel_physics-platypus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|arc:challenge|25_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hellaswag|10_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-53-04.413591.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-53-04.413591.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T15-53-04.413591.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T15-53-04.413591.parquet'
- config_name: results
data_files:
- split: 2023_09_11T15_53_04.413591
path:
- results_2023-09-11T15-53-04.413591.parquet
- split: latest
path:
- results_2023-09-11T15-53-04.413591.parquet
---
# Dataset Card for Evaluation run of lgaalves/gpt2_camel_physics-platypus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/gpt2_camel_physics-platypus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/gpt2_camel_physics-platypus](https://huggingface.co/lgaalves/gpt2_camel_physics-platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__gpt2_camel_physics-platypus",
"harness_truthfulqa_mc_0",
split="train")
```
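To pull only the aggregated scores, here is a minimal sketch using the "results" configuration and the "latest" split declared in the configs above:
```python
from datasets import load_dataset

# Aggregated scores for the run; the "latest" split tracks the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_lgaalves__gpt2_camel_physics-platypus",
    "results",
    split="latest",
)
```
The timestamped split of the same configuration returns the scores for that specific run.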
## Latest results
These are the [latest results from run 2023-09-11T15:53:04.413591](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2_camel_physics-platypus/blob/main/results_2023-09-11T15-53-04.413591.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2683248305375654,
"acc_stderr": 0.03193677298130021,
"acc_norm": 0.2692355712851633,
"acc_norm_stderr": 0.0319495253666372,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.3955559845281961,
"mc2_stderr": 0.014839540193741688
},
"harness|arc:challenge|25": {
"acc": 0.19795221843003413,
"acc_stderr": 0.011643990971573405,
"acc_norm": 0.23037542662116042,
"acc_norm_stderr": 0.01230492841874761
},
"harness|hellaswag|10": {
"acc": 0.29187412865962953,
"acc_stderr": 0.004536955796510544,
"acc_norm": 0.31318462457677754,
"acc_norm_stderr": 0.0046284090842187535
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.0391545063041425,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.0391545063041425
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23026315789473684,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.23026315789473684,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.02761116340239972,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.02761116340239972
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566015,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566015
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364396,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364396
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.02865917937429232,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.02865917937429232
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748142,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748142
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03855289616378948,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03855289616378948
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400175,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400175
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790605,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2870967741935484,
"acc_stderr": 0.025736542745594528,
"acc_norm": 0.2870967741935484,
"acc_norm_stderr": 0.025736542745594528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358611,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358611
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37305699481865284,
"acc_stderr": 0.03490205592048573,
"acc_norm": 0.37305699481865284,
"acc_norm_stderr": 0.03490205592048573
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.02403548967633507,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.02403548967633507
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.026653531596715473,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.026653531596715473
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.03543304234389985,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.03543304234389985
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3486238532110092,
"acc_stderr": 0.020431254090714328,
"acc_norm": 0.3486238532110092,
"acc_norm_stderr": 0.020431254090714328
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658335,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658335
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.10762331838565023,
"acc_stderr": 0.020799400082879997,
"acc_norm": 0.10762331838565023,
"acc_norm_stderr": 0.020799400082879997
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.17857142857142858,
"acc_stderr": 0.036352091215778065,
"acc_norm": 0.17857142857142858,
"acc_norm_stderr": 0.036352091215778065
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20306513409961685,
"acc_stderr": 0.014385525076611578,
"acc_norm": 0.20306513409961685,
"acc_norm_stderr": 0.014385525076611578
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21098265895953758,
"acc_stderr": 0.021966309947043124,
"acc_norm": 0.21098265895953758,
"acc_norm_stderr": 0.021966309947043124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.01435591196476786,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.01435591196476786
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.28938906752411575,
"acc_stderr": 0.025755865922632924,
"acc_norm": 0.28938906752411575,
"acc_norm_stderr": 0.025755865922632924
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590638,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24511082138200782,
"acc_stderr": 0.010986307870045514,
"acc_norm": 0.24511082138200782,
"acc_norm_stderr": 0.010986307870045514
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329376,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329376
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528023,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528023
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724137,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724137
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21890547263681592,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.21890547263681592,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1927710843373494,
"acc_stderr": 0.030709824050565274,
"acc_norm": 0.1927710843373494,
"acc_norm_stderr": 0.030709824050565274
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.25146198830409355,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.25146198830409355,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.3955559845281961,
"mc2_stderr": 0.014839540193741688
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
khaled123/8 | 2023-09-11T15:59:35.000Z | [
"region:us"
] | khaled123 | null | null | null | 0 | 0 | Entry not found |
YXStableDiffusion/CTNet | 2023-09-11T16:10:16.000Z | [
"region:us"
] | YXStableDiffusion | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-pretrain | 2023-09-11T16:10:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of YeungNLP/firefly-llama2-13b-pretrain
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [YeungNLP/firefly-llama2-13b-pretrain](https://huggingface.co/YeungNLP/firefly-llama2-13b-pretrain)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-pretrain\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T16:09:00.658603](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-pretrain/blob/main/results_2023-09-11T16-09-00.658603.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.513550506958836,\n\
\ \"acc_stderr\": 0.03482260410140431,\n \"acc_norm\": 0.5176591429765649,\n\
\ \"acc_norm_stderr\": 0.03480741159130236,\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826838,\n \"mc2\": 0.3623901687630447,\n\
\ \"mc2_stderr\": 0.014311042193634166\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49829351535836175,\n \"acc_stderr\": 0.014611305705056995,\n\
\ \"acc_norm\": 0.5392491467576792,\n \"acc_norm_stderr\": 0.014566303676636584\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.589523999203346,\n\
\ \"acc_stderr\": 0.004909148239488273,\n \"acc_norm\": 0.7909778928500298,\n\
\ \"acc_norm_stderr\": 0.004057792171893571\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309173,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309173\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5509433962264151,\n \"acc_stderr\": 0.030612730713641092,\n\
\ \"acc_norm\": 0.5509433962264151,\n \"acc_norm_stderr\": 0.030612730713641092\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.041711158581816184,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.041711158581816184\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.4682080924855491,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425072,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n\
\ \"acc_stderr\": 0.027575960723278243,\n \"acc_norm\": 0.6225806451612903,\n\
\ \"acc_norm_stderr\": 0.027575960723278243\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39408866995073893,\n \"acc_stderr\": 0.034381579670365446,\n\
\ \"acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.034381579670365446\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187897,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187897\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6363636363636364,\n \"acc_stderr\": 0.03427308652999934,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03427308652999934\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.03182155050916644,\n\
\ \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.03182155050916644\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45897435897435895,\n \"acc_stderr\": 0.025265525491284295,\n\
\ \"acc_norm\": 0.45897435897435895,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115006,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115006\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6788990825688074,\n \"acc_stderr\": 0.020018149772733747,\n \"\
acc_norm\": 0.6788990825688074,\n \"acc_norm_stderr\": 0.020018149772733747\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35648148148148145,\n \"acc_stderr\": 0.032664783315272714,\n \"\
acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.032664783315272714\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6568627450980392,\n \"acc_stderr\": 0.033321399446680854,\n \"\
acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.033321399446680854\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6751054852320675,\n \"acc_stderr\": 0.030486039389105293,\n \
\ \"acc_norm\": 0.6751054852320675,\n \"acc_norm_stderr\": 0.030486039389105293\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.027236013946196697,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.027236013946196697\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7139208173690932,\n\
\ \"acc_stderr\": 0.016160871405127543,\n \"acc_norm\": 0.7139208173690932,\n\
\ \"acc_norm_stderr\": 0.016160871405127543\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.026362437574546545,\n\
\ \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.026362437574546545\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.028275490156791448,\n\
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.028275490156791448\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n\
\ \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.6012861736334405,\n\
\ \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.0277012284685426,\n\
\ \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.0277012284685426\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.02883892147125146,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.02883892147125146\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34876140808344197,\n\
\ \"acc_stderr\": 0.012172035157127115,\n \"acc_norm\": 0.34876140808344197,\n\
\ \"acc_norm_stderr\": 0.012172035157127115\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.030161911930767105,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.030161911930767105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5081699346405228,\n \"acc_stderr\": 0.020225134343057265,\n \
\ \"acc_norm\": 0.5081699346405228,\n \"acc_norm_stderr\": 0.020225134343057265\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5591836734693878,\n \"acc_stderr\": 0.03178419114175363,\n\
\ \"acc_norm\": 0.5591836734693878,\n \"acc_norm_stderr\": 0.03178419114175363\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n\
\ \"acc_stderr\": 0.03265819588512697,\n \"acc_norm\": 0.6915422885572139,\n\
\ \"acc_norm_stderr\": 0.03265819588512697\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.03301405946987251,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.03301405946987251\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826838,\n \"mc2\": 0.3623901687630447,\n\
\ \"mc2_stderr\": 0.014311042193634166\n }\n}\n```"
repo_url: https://huggingface.co/YeungNLP/firefly-llama2-13b-pretrain
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|arc:challenge|25_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hellaswag|10_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T16-09-00.658603.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T16-09-00.658603.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T16-09-00.658603.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T16-09-00.658603.parquet'
- config_name: results
data_files:
- split: 2023_09_11T16_09_00.658603
path:
- results_2023-09-11T16-09-00.658603.parquet
- split: latest
path:
- results_2023-09-11T16-09-00.658603.parquet
---
# Dataset Card for Evaluation run of YeungNLP/firefly-llama2-13b-pretrain
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/YeungNLP/firefly-llama2-13b-pretrain
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [YeungNLP/firefly-llama2-13b-pretrain](https://huggingface.co/YeungNLP/firefly-llama2-13b-pretrain) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-pretrain",
"harness_truthfulqa_mc_0",
split="train")
```
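To get the aggregated scores instead of per-example details, you can load the "results" configuration listed in the YAML metadata above; this is a minimal sketch assuming the standard `datasets` API:
```python
from datasets import load_dataset

# Aggregated metrics for this run; the "latest" split points to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-pretrain",
    "results",
    split="latest",
)
print(results[0])
```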
## Latest results
These are the [latest results from run 2023-09-11T16:09:00.658603](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-pretrain/blob/main/results_2023-09-11T16-09-00.658603.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.513550506958836,
"acc_stderr": 0.03482260410140431,
"acc_norm": 0.5176591429765649,
"acc_norm_stderr": 0.03480741159130236,
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826838,
"mc2": 0.3623901687630447,
"mc2_stderr": 0.014311042193634166
},
"harness|arc:challenge|25": {
"acc": 0.49829351535836175,
"acc_stderr": 0.014611305705056995,
"acc_norm": 0.5392491467576792,
"acc_norm_stderr": 0.014566303676636584
},
"harness|hellaswag|10": {
"acc": 0.589523999203346,
"acc_stderr": 0.004909148239488273,
"acc_norm": 0.7909778928500298,
"acc_norm_stderr": 0.004057792171893571
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5509433962264151,
"acc_stderr": 0.030612730713641092,
"acc_norm": 0.5509433962264151,
"acc_norm_stderr": 0.030612730713641092
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.041711158581816184,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.041711158581816184
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425072,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.027575960723278243,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.027575960723278243
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.034381579670365446,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.034381579670365446
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187897,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187897
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03427308652999934,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03427308652999934
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7357512953367875,
"acc_stderr": 0.03182155050916644,
"acc_norm": 0.7357512953367875,
"acc_norm_stderr": 0.03182155050916644
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45897435897435895,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.45897435897435895,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073838,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073838
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115006,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115006
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6788990825688074,
"acc_stderr": 0.020018149772733747,
"acc_norm": 0.6788990825688074,
"acc_norm_stderr": 0.020018149772733747
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35648148148148145,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.35648148148148145,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.033321399446680854,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.033321399446680854
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6751054852320675,
"acc_stderr": 0.030486039389105293,
"acc_norm": 0.6751054852320675,
"acc_norm_stderr": 0.030486039389105293
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.043285772152629715,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.043285772152629715
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280042,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280042
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196697,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196697
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7139208173690932,
"acc_stderr": 0.016160871405127543,
"acc_norm": 0.7139208173690932,
"acc_norm_stderr": 0.016160871405127543
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.026362437574546545,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.026362437574546545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.028275490156791448,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.028275490156791448
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6012861736334405,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.6012861736334405,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.0277012284685426,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.0277012284685426
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.02883892147125146,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.02883892147125146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34876140808344197,
"acc_stderr": 0.012172035157127115,
"acc_norm": 0.34876140808344197,
"acc_norm_stderr": 0.012172035157127115
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.030161911930767105,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.030161911930767105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5081699346405228,
"acc_stderr": 0.020225134343057265,
"acc_norm": 0.5081699346405228,
"acc_norm_stderr": 0.020225134343057265
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5591836734693878,
"acc_stderr": 0.03178419114175363,
"acc_norm": 0.5591836734693878,
"acc_norm_stderr": 0.03178419114175363
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.03265819588512697,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.03265819588512697
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.03301405946987251,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.03301405946987251
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826838,
"mc2": 0.3623901687630447,
"mc2_stderr": 0.014311042193634166
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hmxiong/ScanNet-Detection-Instruction | 2023-10-10T11:11:06.000Z | [
"region:us"
] | hmxiong | null | null | null | 0 | 0 | # V0
The data directly encodes all bounding boxes into a single sentence that is fed to the LLM; the model has to regress every coordinate value directly from the input.
# V1
Builds on V0 by adding category prompts.
# V2/V2_normalized
The data consists of each object's category and bbox, but the coordinates are not turned into tokens directly; instead, the bbox coordinates are encoded into special tokens inside the program and used as the regression targets (see the sketch below).
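A minimal sketch of what such a special-token encoding could look like is given below, assuming coordinates are normalized to the scene bounds and quantized into a fixed number of bins; the `<loc_*>` token names, the bin count, and the `encode_bbox` helper are illustrative and are not code from this repository.

```python
NUM_BINS = 256  # quantization resolution (illustrative)

def encode_bbox(bbox, scene_min, scene_max, num_bins=NUM_BINS):
    """Encode a 3D box (x0, y0, z0, x1, y1, z1) as special location tokens.

    scene_min / scene_max are per-axis scene bounds in the same units
    as the box, e.g. (0, 0, 0) and (8.0, 8.0, 3.0) metres.
    """
    tokens = []
    for i, value in enumerate(bbox):
        lo, hi = scene_min[i % 3], scene_max[i % 3]
        # Normalize to [0, 1] inside the program, then quantize to a bin id.
        norm = min(max((value - lo) / (hi - lo), 0.0), 1.0)
        tokens.append(f"<loc_{round(norm * (num_bins - 1))}>")
    return tokens

# Example: a chair inside an 8 m x 8 m x 3 m scene.
print("chair", " ".join(encode_bbox(
    (1.2, 0.5, 0.0, 1.8, 1.1, 0.9),
    scene_min=(0.0, 0.0, 0.0),
    scene_max=(8.0, 8.0, 3.0))))
# -> chair <loc_38> <loc_16> <loc_0> <loc_57> <loc_35> <loc_76>
```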
# V3_normalized
Builds on the V2_normalized data by adding the full category list and the corresponding token mapping to the question; everything else is identical, only shuffled.
# V4
Data collected from scannet_detection_train, without normalization applied; all normalization is now done inside the program, and local guidance is added.
# V5_normalized
Builds on the V4_normalized data by representing the raw box information as "close to center" and "far from center".
V4 is currently the main experimental dataset.
The point cloud data comes from scannet_detection_train, whose visualizations show no spatial offset. |
open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus | 2023-09-11T16:12:59.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NekoPunchBBB/Llama-2-13b-hf_Open-Platypus
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NekoPunchBBB/Llama-2-13b-hf_Open-Platypus](https://huggingface.co/NekoPunchBBB/Llama-2-13b-hf_Open-Platypus)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T16:11:41.270351](https://huggingface.co/datasets/open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus/blob/main/results_2023-09-11T16-11-41.270351.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5509125885849774,\n\
\ \"acc_stderr\": 0.0344588285887975,\n \"acc_norm\": 0.555047768984873,\n\
\ \"acc_norm_stderr\": 0.03443868276596075,\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.015846315101394802,\n \"mc2\": 0.4284193316007184,\n\
\ \"mc2_stderr\": 0.014486178746194435\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5511945392491467,\n \"acc_stderr\": 0.014534599585097664,\n\
\ \"acc_norm\": 0.5887372013651877,\n \"acc_norm_stderr\": 0.014379441068522082\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6149173471420036,\n\
\ \"acc_stderr\": 0.004856203374715453,\n \"acc_norm\": 0.8213503286197968,\n\
\ \"acc_norm_stderr\": 0.003822758343922915\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009794,\n\
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009794\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596433,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n\
\ \"acc_stderr\": 0.026923446059302844,\n \"acc_norm\": 0.6612903225806451,\n\
\ \"acc_norm_stderr\": 0.026923446059302844\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.035025446508458714,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.035025446508458714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5153846153846153,\n \"acc_stderr\": 0.025339003010106515,\n\
\ \"acc_norm\": 0.5153846153846153,\n \"acc_norm_stderr\": 0.025339003010106515\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.03218358107742613,\n \
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.03218358107742613\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7577981651376147,\n \"acc_stderr\": 0.01836817630659862,\n \"\
acc_norm\": 0.7577981651376147,\n \"acc_norm_stderr\": 0.01836817630659862\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.033509916046960415,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.033509916046960415\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695053,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695053\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.0368035037128646,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.0368035037128646\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.02645350805404033,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.02645350805404033\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\
\ \"acc_stderr\": 0.015464676163395958,\n \"acc_norm\": 0.7509578544061303,\n\
\ \"acc_norm_stderr\": 0.015464676163395958\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36983240223463687,\n\
\ \"acc_stderr\": 0.016145881256056215,\n \"acc_norm\": 0.36983240223463687,\n\
\ \"acc_norm_stderr\": 0.016145881256056215\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.028036092273891776,\n\
\ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.028036092273891776\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.027368078243971635,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.027368078243971635\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.026869490744815257,\n\
\ \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.026869490744815257\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573086,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573086\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4041720990873533,\n\
\ \"acc_stderr\": 0.012533504046491362,\n \"acc_norm\": 0.4041720990873533,\n\
\ \"acc_norm_stderr\": 0.012533504046491362\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5588235294117647,\n \"acc_stderr\": 0.020087362076702857,\n \
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.020087362076702857\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.03168091161233882,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.03168091161233882\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.031157150869355568,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.031157150869355568\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.015846315101394802,\n \"mc2\": 0.4284193316007184,\n\
\ \"mc2_stderr\": 0.014486178746194435\n }\n}\n```"
repo_url: https://huggingface.co/NekoPunchBBB/Llama-2-13b-hf_Open-Platypus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|arc:challenge|25_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hellaswag|10_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T16-11-41.270351.parquet'
- config_name: results
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- results_2023-09-11T16-11-41.270351.parquet
- split: latest
path:
- results_2023-09-11T16-11-41.270351.parquet
---
# Dataset Card for Evaluation run of NekoPunchBBB/Llama-2-13b-hf_Open-Platypus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NekoPunchBBB/Llama-2-13b-hf_Open-Platypus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NekoPunchBBB/Llama-2-13b-hf_Open-Platypus](https://huggingface.co/NekoPunchBBB/Llama-2-13b-hf_Open-Platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus",
"harness_truthfulqa_mc_0",
split="train")
```
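The aggregated scores can be loaded the same way through the `results` configuration declared in the metadata above, for instance via its `latest` split:

```python
from datasets import load_dataset

# "results" holds the aggregated scores; its "latest" split points to the
# most recent evaluation run (see the configs section of this card).
results = load_dataset(
    "open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus",
    "results",
    split="latest")
```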
## Latest results
These are the [latest results from run 2023-09-11T16:11:41.270351](https://huggingface.co/datasets/open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus/blob/main/results_2023-09-11T16-11-41.270351.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5509125885849774,
"acc_stderr": 0.0344588285887975,
"acc_norm": 0.555047768984873,
"acc_norm_stderr": 0.03443868276596075,
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394802,
"mc2": 0.4284193316007184,
"mc2_stderr": 0.014486178746194435
},
"harness|arc:challenge|25": {
"acc": 0.5511945392491467,
"acc_stderr": 0.014534599585097664,
"acc_norm": 0.5887372013651877,
"acc_norm_stderr": 0.014379441068522082
},
"harness|hellaswag|10": {
"acc": 0.6149173471420036,
"acc_stderr": 0.004856203374715453,
"acc_norm": 0.8213503286197968,
"acc_norm_stderr": 0.003822758343922915
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009794,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009794
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596433,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147126,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.026923446059302844,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.026923446059302844
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.035025446508458714,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.035025446508458714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5153846153846153,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.5153846153846153,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7577981651376147,
"acc_stderr": 0.01836817630659862,
"acc_norm": 0.7577981651376147,
"acc_norm_stderr": 0.01836817630659862
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.033509916046960415,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.033509916046960415
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695053,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695053
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.0368035037128646,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.0368035037128646
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.02645350805404033,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.02645350805404033
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395958,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395958
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36983240223463687,
"acc_stderr": 0.016145881256056215,
"acc_norm": 0.36983240223463687,
"acc_norm_stderr": 0.016145881256056215
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.028036092273891776,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.028036092273891776
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971635,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971635
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.026869490744815257,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.026869490744815257
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573086,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573086
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4041720990873533,
"acc_stderr": 0.012533504046491362,
"acc_norm": 0.4041720990873533,
"acc_norm_stderr": 0.012533504046491362
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.020087362076702857,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.020087362076702857
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.03168091161233882,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.03168091161233882
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355568,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355568
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394802,
"mc2": 0.4284193316007184,
"mc2_stderr": 0.014486178746194435
}
}
```
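For a quick look at the per-sample details behind one of these scores, the "latest" split of the corresponding configuration can be loaded directly. This is only an illustrative sketch: the config name below follows the `harness_<task>_<n_shots>` naming pattern used by these leaderboard detail repositories and is assumed rather than verified here.
```python
from datasets import load_dataset

# Illustrative: pull the most recent ARC-Challenge (25-shot) details for this model.
# The "latest" split always points at the newest evaluation run.
details = load_dataset(
    "open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus",
    "harness_arc_challenge_25",  # assumed config name, following the usual pattern
    split="latest",
)
print(details[0])  # one row per evaluated example
```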
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DavidM94/Helper | 2023-09-11T16:20:42.000Z | [
"region:us"
] | DavidM94 | null | null | null | 0 | 0 | Entry not found |
v2ray/airoboros-2.2-dealignment | 2023-09-11T16:46:59.000Z | [
"license:other",
"region:us"
] | v2ray | null | null | null | 1 | 0 | ---
license: other
---
# Airoboros 2.2 Dealignment
This is a dealignment extraction of the airoboros-2.2 dataset which can be found [here](https://huggingface.co/datasets/jondurbin/airoboros-2.2).
**ALL CREDITS TO [@jondurbin](https://huggingface.co/jondurbin) FOR THIS AWESOME DATASET!**
**YOU MUST HAVE ACCESS TO THE ORIGINAL DATASET BEFORE REQUESTING ACCESS TO THIS DATASET!** \
(But I can't check if you actually have it or not so I set it to auto approval.)
# Original README.md
## Overview
This dataset is mostly a continuation of https://hf.co/datasets/jondurbin/airoboros-2.1, with some notable additions and fixes.
__*I've gated access with a request, due to the de-alignment data. To download, you must agree to the following:*__
- Some of the content is "toxic"/"harmful", and contains profanity and other types of sensitive content.
- None of the content or views contained in text within this dataset necessarily align with my personal beliefs or opinions, they are simply text generated by LLMs and/or scraped from the web.
- Use with extreme caution, particularly in locations with less-than-free speech laws.
- You, and you alone are responsible for having downloaded the dataset and having a copy of the contents therein and I am completely indemnified from any and all liabilities.
### 2.1 Contamination
I accidentally included some of the benchmark data in the first version of the airboros-2.1 model, which is why it had a crazy high truthfulqa score. Discussions here:
- https://huggingface.co/jondurbin/airoboros-l2-70b-2.1/discussions/3#64f325ce352152814d1f796a
- https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard/discussions/225#64f0997659da193a12b78c32
I flagged it for removal and recreated the model right away, but the leaderboard cached the old results so it took some time to reflect.
Some of the instructors I use create overlapping data, and it's hard to filter, especially since the instructions aren't typically verbatim with the benchmark questions.
This time around, I used `thenlper/gte-small` to calculate embeddings of the instructions, along with a faiss index, and removed anything from the dataset that had a similarity score < 0.15 (from truthfulqa). If you have a better way of checking, please let me know!
I haven't done the same for most other benchmarks (yet) because there are hundreds of thousands of instructions and it would be pretty computationally expensive to do. That said, I only have ~1279 multiple choice questions, all randomly GPT generated, so there's probably little-to-no overlap.
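In rough terms, the embedding-based decontamination pass described above looks something like the sketch below. This is illustrative only, not the actual airoboros code: it assumes `sentence-transformers` is used to run `thenlper/gte-small`, and it treats the faiss score as an L2 distance where a smaller value means a closer match (which is how a removal cutoff of < 0.15 makes sense).
```python
import faiss
from sentence_transformers import SentenceTransformer

# Placeholder data; in practice these would be the full instruction set and
# the benchmark (e.g. truthfulqa) questions.
instructions = ["What is the capital of France?", "Write a haiku about rain."]
benchmark_questions = ["What is the capital of France?"]

model = SentenceTransformer("thenlper/gte-small")

# Embed the benchmark questions and index them.
bench_emb = model.encode(benchmark_questions, convert_to_numpy=True).astype("float32")
index = faiss.IndexFlatL2(bench_emb.shape[1])
index.add(bench_emb)

# Embed the candidate instructions and look up each one's nearest benchmark question.
inst_emb = model.encode(instructions, convert_to_numpy=True).astype("float32")
scores, _ = index.search(inst_emb, k=1)

# Drop anything whose nearest-neighbour score falls under the 0.15 cutoff
# (smaller score = more similar here, since it is a distance).
kept = [inst for inst, s in zip(instructions, scores[:, 0]) if s >= 0.15]
print(f"kept {len(kept)} of {len(instructions)} instructions")
```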
### Awareness
I added a new "awareness" instructor, which aims to add a lot more nuance to responses relating to time, location, senses, etc. based on the system prompt.
For example, if you are using the standard prompt with user/assistant, and ask how long it would take to get to Chicago, the answer will be something about AI not having a physical presence.
If, on the other hand, you are using a system prompt with a human character specified, the model attempts to infer location from "home" and will provide a more nuanced answer as a human would (in theory).
https://github.com/jondurbin/airoboros/commit/e91562c88d7610edb051606622e7c25a99884f7e
### Editor
I created a text edit instructor as well, which uses a reverse prompt mechanism, meaning it takes the existing writing samples that have been generated, rewrites them to have misspellings, poor grammar, etc., then uses a prompt like "Please correct and improve the text." with the original well-written text and target output.
https://github.com/jondurbin/airoboros/commit/e60a68de5f9622320c9cfff3b238bd83cc7e373b
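Conceptually, the reverse-prompt step amounts to corrupting a clean writing sample and pairing the messy version (as input) with the original (as target). The snippet below is a made-up illustration of that idea; the corruption logic and field names are assumptions, not the airoboros implementation:
```python
import random

def corrupt(text: str, rate: float = 0.05) -> str:
    """Introduce cheap 'typos' by randomly dropping or swapping characters."""
    chars = list(text)
    for i in range(len(chars) - 1):
        r = random.random()
        if r < rate:
            chars[i] = ""  # drop a character
        elif r < 2 * rate:
            chars[i], chars[i + 1] = chars[i + 1], chars[i]  # swap neighbours
    return "".join(chars)

def make_editor_example(clean_text: str) -> dict:
    # The degraded text goes into the instruction; the original clean text is the target.
    return {
        "instruction": "Please correct and improve the text.\n\n" + corrupt(clean_text),
        "response": clean_text,
    }

print(make_editor_example("The quick brown fox jumps over the lazy dog."))
```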
### Writing
I regenerated (almost) all of the training data that included "Once upon a time..." because it's too cliche and boring.
### Multiple choice
I created many more multiple choice questions, many of which have additional text context.
### Roleplay/conversation
I re-created all of the GTKM and RP datasets this time around, removing all of the "USER: " and "ASSISTANT: " prefixes from the instructions/responses, so it's more compatible with existing interfaces.
The GTKM instructor now does the same thing as RP, in that it saves each round of "conversation" as a separate row in the output - previously it only saved the final response, which may not have been sufficient since I don't typically train on inputs.
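To make the "one row per round" idea concrete, the reshaping could look roughly like this (field names and structure here are illustrative guesses, not the actual airoboros output format):
```python
def conversation_to_rows(system_prompt: str, turns: list[tuple[str, str]]) -> list[dict]:
    """Turn a multi-round (user, assistant) conversation into one training row per round."""
    rows = []
    history: list[str] = []
    for user_msg, assistant_msg in turns:
        rows.append({
            "system": system_prompt,
            # Prior rounds plus the current user message, without role prefixes.
            "instruction": "\n".join(history + [user_msg]),
            "response": assistant_msg,
        })
        history.extend([user_msg, assistant_msg])
    return rows

rows = conversation_to_rows(
    "You are a grumpy wizard.",
    [("Who are you?", "A wizard. Obviously."), ("Can you help me?", "If I must.")],
)
print(len(rows))  # 2 rows, one per round of conversation
```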
### De-alignment
I included a small sampling of "de-alignment" data. The llama-2 base models seem extremely reluctant to discuss certain topics, curse, or otherwise produce other-than-pg content. I don't want a vile model, but I also don't *NOT* want a vile model.
- comedy skits, to add more comedy and occasional cursing
- instruction/response pairs that would typically otherwise be refused
- various (LLM enhanced) stories from the internet with somewhat spicy content
- story-writing tasks as a re-telling of popular horror/fantasy films (by default, the LLM generated stories often have too sunny of a disposition, so hopefully this will allow for some twists or more interesting stories)
- rude responses (if a character card specifies they are rude and curse, it should actually do so without prompt hacking IMO)
None of the content or views contained in text within this dataset necessarily align with my personal beliefs or opinions, they are simply text generated by LLMs and/or scraped from the web. Use with extreme caution, particularly in locations with strict speech laws!
See "instructions-clean.jsonl" for a version without dealignment data.
### UTF-8 to ASCII
I replaced most of the "standard" utf-8 sequences - left double quote, right double quote, left apostrophe, ellipses - with standard ascii characters. I don't know if this was contributing to part of the issue with eos tokens being produced after apostrophes, but I figured it was worth trying.
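As a small illustration of that normalization (the exact character set handled in airoboros isn't spelled out here, so the mapping below is just a representative guess):
```python
# Map common "smart" punctuation to plain ASCII equivalents.
UTF8_TO_ASCII = {
    "\u201c": '"',    # left double quote
    "\u201d": '"',    # right double quote
    "\u2018": "'",    # left single quote
    "\u2019": "'",    # right single quote / apostrophe
    "\u2026": "...",  # ellipsis
}

def asciify(text: str) -> str:
    for src, dst in UTF8_TO_ASCII.items():
        text = text.replace(src, dst)
    return text

print(asciify("\u201cDon\u2019t stop\u2026\u201d"))  # prints: "Don't stop..."
```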
### Summarization
I also included 500 examples from:
https://hf.co/datasets/mattpscott/airoboros-summarization
These are existing summarizations from various public datasets, formatted to airoboros-style contextual qa.
Thanks Matt!
### Usage/license info
Much (most) of the data was generated via gpt-4 API calls, which has a restriction in the ToS about "competing" models. Please seek legal advice if you plan to build or use a model that includes this dataset in a commercial setting. |
open-llm-leaderboard/details_ahxt__llama2_xs_460M_experimental | 2023-09-11T16:46:19.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ahxt/llama2_xs_460M_experimental
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ahxt/llama2_xs_460M_experimental](https://huggingface.co/ahxt/llama2_xs_460M_experimental)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ahxt__llama2_xs_460M_experimental\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T16:45:07.137608](https://huggingface.co/datasets/open-llm-leaderboard/details_ahxt__llama2_xs_460M_experimental/blob/main/results_2023-09-11T16-45-07.137608.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26203176592138006,\n\
\ \"acc_stderr\": 0.031723424035979185,\n \"acc_norm\": 0.2635747128595633,\n\
\ \"acc_norm_stderr\": 0.031736820283279565,\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.41591461733747837,\n\
\ \"mc2_stderr\": 0.01491393118316991\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2158703071672355,\n \"acc_stderr\": 0.012022975360030672,\n\
\ \"acc_norm\": 0.24914675767918087,\n \"acc_norm_stderr\": 0.012639407111926435\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3269269069906393,\n\
\ \"acc_stderr\": 0.004681316064444433,\n \"acc_norm\": 0.3846843258315077,\n\
\ \"acc_norm_stderr\": 0.004855262903270804\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n\
\ \"acc_stderr\": 0.03820169914517904,\n \"acc_norm\": 0.26666666666666666,\n\
\ \"acc_norm_stderr\": 0.03820169914517904\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24528301886792453,\n \"acc_stderr\": 0.026480357179895688,\n\
\ \"acc_norm\": 0.24528301886792453,\n \"acc_norm_stderr\": 0.026480357179895688\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.0321473730202947,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.0321473730202947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.19574468085106383,\n \"acc_stderr\": 0.025937853139977148,\n\
\ \"acc_norm\": 0.19574468085106383,\n \"acc_norm_stderr\": 0.025937853139977148\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2870967741935484,\n \"acc_stderr\": 0.025736542745594525,\n \"\
acc_norm\": 0.2870967741935484,\n \"acc_norm_stderr\": 0.025736542745594525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2660098522167488,\n \"acc_stderr\": 0.031089826002937523,\n \"\
acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.031089826002937523\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.03317505930009179,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.03317505930009179\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3383838383838384,\n \"acc_stderr\": 0.03371124142626302,\n \"\
acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.03371124142626302\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.33678756476683935,\n \"acc_stderr\": 0.03410780251836183,\n\
\ \"acc_norm\": 0.33678756476683935,\n \"acc_norm_stderr\": 0.03410780251836183\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.32564102564102565,\n \"acc_stderr\": 0.02375966576741229,\n\
\ \"acc_norm\": 0.32564102564102565,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514566,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514566\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882357,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882357\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.29541284403669726,\n \"acc_stderr\": 0.019560619182976,\n \"acc_norm\"\
: 0.29541284403669726,\n \"acc_norm_stderr\": 0.019560619182976\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.23039215686274508,\n \"acc_stderr\": 0.02955429260569506,\n\
\ \"acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.02955429260569506\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.22784810126582278,\n \"acc_stderr\": 0.02730348459906942,\n \
\ \"acc_norm\": 0.22784810126582278,\n \"acc_norm_stderr\": 0.02730348459906942\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.15695067264573992,\n\
\ \"acc_stderr\": 0.024413587174907426,\n \"acc_norm\": 0.15695067264573992,\n\
\ \"acc_norm_stderr\": 0.024413587174907426\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.036412970813137296,\n\
\ \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.036412970813137296\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.34710743801652894,\n \"acc_stderr\": 0.04345724570292534,\n \"\
acc_norm\": 0.34710743801652894,\n \"acc_norm_stderr\": 0.04345724570292534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.1875,\n\
\ \"acc_stderr\": 0.0370468111477387,\n \"acc_norm\": 0.1875,\n \
\ \"acc_norm_stderr\": 0.0370468111477387\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.21794871794871795,\n\
\ \"acc_stderr\": 0.02704685763071666,\n \"acc_norm\": 0.21794871794871795,\n\
\ \"acc_norm_stderr\": 0.02704685763071666\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2656449553001277,\n\
\ \"acc_stderr\": 0.015794302487888726,\n \"acc_norm\": 0.2656449553001277,\n\
\ \"acc_norm_stderr\": 0.015794302487888726\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.022894082489925992,\n\
\ \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.022894082489925992\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28104575163398693,\n \"acc_stderr\": 0.025738854797818723,\n\
\ \"acc_norm\": 0.28104575163398693,\n \"acc_norm_stderr\": 0.025738854797818723\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19292604501607716,\n\
\ \"acc_stderr\": 0.022411516780911363,\n \"acc_norm\": 0.19292604501607716,\n\
\ \"acc_norm_stderr\": 0.022411516780911363\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872405,\n \
\ \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2685788787483703,\n\
\ \"acc_stderr\": 0.011320056629121734,\n \"acc_norm\": 0.2685788787483703,\n\
\ \"acc_norm_stderr\": 0.011320056629121734\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.23366013071895425,\n \"acc_stderr\": 0.017119158496044503,\n \
\ \"acc_norm\": 0.23366013071895425,\n \"acc_norm_stderr\": 0.017119158496044503\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.04069306319721377,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.04069306319721377\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3020408163265306,\n \"acc_stderr\": 0.029393609319879815,\n\
\ \"acc_norm\": 0.3020408163265306,\n \"acc_norm_stderr\": 0.029393609319879815\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2835820895522388,\n\
\ \"acc_stderr\": 0.031871875379197986,\n \"acc_norm\": 0.2835820895522388,\n\
\ \"acc_norm_stderr\": 0.031871875379197986\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.41591461733747837,\n\
\ \"mc2_stderr\": 0.01491393118316991\n }\n}\n```"
repo_url: https://huggingface.co/ahxt/llama2_xs_460M_experimental
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|arc:challenge|25_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hellaswag|10_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T16-45-07.137608.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T16-45-07.137608.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T16-45-07.137608.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T16-45-07.137608.parquet'
- config_name: results
data_files:
- split: 2023_09_11T16_45_07.137608
path:
- results_2023-09-11T16-45-07.137608.parquet
- split: latest
path:
- results_2023-09-11T16-45-07.137608.parquet
---
# Dataset Card for Evaluation run of ahxt/llama2_xs_460M_experimental
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ahxt/llama2_xs_460M_experimental
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ahxt/llama2_xs_460M_experimental](https://huggingface.co/ahxt/llama2_xs_460M_experimental) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ahxt__llama2_xs_460M_experimental",
"harness_truthfulqa_mc_0",
split="train")
```
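Each configuration also exposes a "latest" split that mirrors the most recent timestamped run. As a minimal sketch, the aggregated metrics can be loaded from the "results" configuration at that split:
```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent evaluation run.
# The "latest" split always mirrors the newest timestamped split.
results = load_dataset(
    "open-llm-leaderboard/details_ahxt__llama2_xs_460M_experimental",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics
```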
## Latest results
These are the [latest results from run 2023-09-11T16:45:07.137608](https://huggingface.co/datasets/open-llm-leaderboard/details_ahxt__llama2_xs_460M_experimental/blob/main/results_2023-09-11T16-45-07.137608.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26203176592138006,
"acc_stderr": 0.031723424035979185,
"acc_norm": 0.2635747128595633,
"acc_norm_stderr": 0.031736820283279565,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.41591461733747837,
"mc2_stderr": 0.01491393118316991
},
"harness|arc:challenge|25": {
"acc": 0.2158703071672355,
"acc_stderr": 0.012022975360030672,
"acc_norm": 0.24914675767918087,
"acc_norm_stderr": 0.012639407111926435
},
"harness|hellaswag|10": {
"acc": 0.3269269069906393,
"acc_stderr": 0.004681316064444433,
"acc_norm": 0.3846843258315077,
"acc_norm_stderr": 0.004855262903270804
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03820169914517904,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03820169914517904
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24528301886792453,
"acc_stderr": 0.026480357179895688,
"acc_norm": 0.24528301886792453,
"acc_norm_stderr": 0.026480357179895688
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.0321473730202947,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.0321473730202947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207763,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207763
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.19574468085106383,
"acc_stderr": 0.025937853139977148,
"acc_norm": 0.19574468085106383,
"acc_norm_stderr": 0.025937853139977148
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2870967741935484,
"acc_stderr": 0.025736542745594525,
"acc_norm": 0.2870967741935484,
"acc_norm_stderr": 0.025736542745594525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.031089826002937523,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.031089826002937523
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.03317505930009179,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.03317505930009179
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3383838383838384,
"acc_stderr": 0.03371124142626302,
"acc_norm": 0.3383838383838384,
"acc_norm_stderr": 0.03371124142626302
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.33678756476683935,
"acc_stderr": 0.03410780251836183,
"acc_norm": 0.33678756476683935,
"acc_norm_stderr": 0.03410780251836183
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.32564102564102565,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.32564102564102565,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514566,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514566
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882357,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882357
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.29541284403669726,
"acc_stderr": 0.019560619182976,
"acc_norm": 0.29541284403669726,
"acc_norm_stderr": 0.019560619182976
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.02955429260569506,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.02955429260569506
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22784810126582278,
"acc_stderr": 0.02730348459906942,
"acc_norm": 0.22784810126582278,
"acc_norm_stderr": 0.02730348459906942
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.15695067264573992,
"acc_stderr": 0.024413587174907426,
"acc_norm": 0.15695067264573992,
"acc_norm_stderr": 0.024413587174907426
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.34710743801652894,
"acc_stderr": 0.04345724570292534,
"acc_norm": 0.34710743801652894,
"acc_norm_stderr": 0.04345724570292534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.1875,
"acc_stderr": 0.0370468111477387,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.0370468111477387
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.02704685763071666,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.02704685763071666
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2656449553001277,
"acc_stderr": 0.015794302487888726,
"acc_norm": 0.2656449553001277,
"acc_norm_stderr": 0.015794302487888726
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.022894082489925992,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.022894082489925992
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28104575163398693,
"acc_stderr": 0.025738854797818723,
"acc_norm": 0.28104575163398693,
"acc_norm_stderr": 0.025738854797818723
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19292604501607716,
"acc_stderr": 0.022411516780911363,
"acc_norm": 0.19292604501607716,
"acc_norm_stderr": 0.022411516780911363
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872405,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2685788787483703,
"acc_stderr": 0.011320056629121734,
"acc_norm": 0.2685788787483703,
"acc_norm_stderr": 0.011320056629121734
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23366013071895425,
"acc_stderr": 0.017119158496044503,
"acc_norm": 0.23366013071895425,
"acc_norm_stderr": 0.017119158496044503
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721377,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721377
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3020408163265306,
"acc_stderr": 0.029393609319879815,
"acc_norm": 0.3020408163265306,
"acc_norm_stderr": 0.029393609319879815
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2835820895522388,
"acc_stderr": 0.031871875379197986,
"acc_norm": 0.2835820895522388,
"acc_norm_stderr": 0.031871875379197986
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370519,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370519
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.41591461733747837,
"mc2_stderr": 0.01491393118316991
}
}
```
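The same numbers can also be read programmatically from the raw JSON file linked above. A minimal sketch (assuming the layout shown in this card; some dumps nest the metrics under a top-level "results" key, so the snippet handles both):

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file for this run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_ahxt__llama2_xs_460M_experimental",
    filename="results_2023-09-11T16-45-07.137608.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

metrics = data.get("results", data)  # handle both flat and nested layouts
print(metrics["all"]["acc"])         # overall accuracy across all tasks
```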
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft1 | 2023-09-11T17:01:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Mikivis/gpt2-large-lora-sft1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Mikivis/gpt2-large-lora-sft1](https://huggingface.co/Mikivis/gpt2-large-lora-sft1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T17:00:06.378151](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft1/blob/main/results_2023-09-11T17-00-06.378151.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25023348485813046,\n\
\ \"acc_stderr\": 0.031398973577972646,\n \"acc_norm\": 0.25188038577213395,\n\
\ \"acc_norm_stderr\": 0.0314103954161941,\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.01484306150773162,\n \"mc2\": 0.39365434891343787,\n\
\ \"mc2_stderr\": 0.014312149499056473\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.21843003412969283,\n \"acc_stderr\": 0.012074291605700975,\n\
\ \"acc_norm\": 0.24658703071672355,\n \"acc_norm_stderr\": 0.012595726268790124\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.35769766978689504,\n\
\ \"acc_stderr\": 0.004783428874273593,\n \"acc_norm\": 0.4267078271260705,\n\
\ \"acc_norm_stderr\": 0.004935882666250471\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.18,\n\
\ \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.18,\n \
\ \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510861,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510861\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483099,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483099\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.30638297872340425,\n \"acc_stderr\": 0.03013590647851756,\n\
\ \"acc_norm\": 0.30638297872340425,\n \"acc_norm_stderr\": 0.03013590647851756\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633345,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633345\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.03512207412302052,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.03512207412302052\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.26129032258064516,\n\
\ \"acc_stderr\": 0.024993053397764822,\n \"acc_norm\": 0.26129032258064516,\n\
\ \"acc_norm_stderr\": 0.024993053397764822\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.29292929292929293,\n \"acc_stderr\": 0.03242497958178815,\n \"\
acc_norm\": 0.29292929292929293,\n \"acc_norm_stderr\": 0.03242497958178815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27461139896373055,\n \"acc_stderr\": 0.032210245080411565,\n\
\ \"acc_norm\": 0.27461139896373055,\n \"acc_norm_stderr\": 0.032210245080411565\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23076923076923078,\n \"acc_stderr\": 0.021362027725222728,\n\
\ \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.021362027725222728\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145658,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145658\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.027553614467863797,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.027553614467863797\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22201834862385322,\n \"acc_stderr\": 0.01781884956479662,\n \"\
acc_norm\": 0.22201834862385322,\n \"acc_norm_stderr\": 0.01781884956479662\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.23148148148148148,\n \"acc_stderr\": 0.028765111718046934,\n \"\
acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.028765111718046934\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2696078431372549,\n \"acc_stderr\": 0.03114557065948678,\n \"\
acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.03114557065948678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2062780269058296,\n\
\ \"acc_stderr\": 0.027157150479563824,\n \"acc_norm\": 0.2062780269058296,\n\
\ \"acc_norm_stderr\": 0.027157150479563824\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.34951456310679613,\n \"acc_stderr\": 0.047211885060971744,\n\
\ \"acc_norm\": 0.34951456310679613,\n \"acc_norm_stderr\": 0.047211885060971744\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23504273504273504,\n\
\ \"acc_stderr\": 0.02777883590493543,\n \"acc_norm\": 0.23504273504273504,\n\
\ \"acc_norm_stderr\": 0.02777883590493543\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653697,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653697\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2554278416347382,\n\
\ \"acc_stderr\": 0.01559495538445577,\n \"acc_norm\": 0.2554278416347382,\n\
\ \"acc_norm_stderr\": 0.01559495538445577\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n\
\ \"acc_stderr\": 0.014508979453553984,\n \"acc_norm\": 0.25139664804469275,\n\
\ \"acc_norm_stderr\": 0.014508979453553984\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3054662379421222,\n\
\ \"acc_stderr\": 0.026160584450140488,\n \"acc_norm\": 0.3054662379421222,\n\
\ \"acc_norm_stderr\": 0.026160584450140488\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.024748624490537368,\n\
\ \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.024748624490537368\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.02635806569888059,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.02635806569888059\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2503259452411995,\n\
\ \"acc_stderr\": 0.01106415102716544,\n \"acc_norm\": 0.2503259452411995,\n\
\ \"acc_norm_stderr\": 0.01106415102716544\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16911764705882354,\n \"acc_stderr\": 0.02277086801011303,\n\
\ \"acc_norm\": 0.16911764705882354,\n \"acc_norm_stderr\": 0.02277086801011303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.21241830065359477,\n \"acc_stderr\": 0.016547148636203147,\n \
\ \"acc_norm\": 0.21241830065359477,\n \"acc_norm_stderr\": 0.016547148636203147\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.039559328617958335,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.039559328617958335\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.025607375986579153,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.025607375986579153\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n\
\ \"acc_stderr\": 0.03410646614071855,\n \"acc_norm\": 0.25903614457831325,\n\
\ \"acc_norm_stderr\": 0.03410646614071855\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.01484306150773162,\n \"mc2\": 0.39365434891343787,\n\
\ \"mc2_stderr\": 0.014312149499056473\n }\n}\n```"
repo_url: https://huggingface.co/Mikivis/gpt2-large-lora-sft1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-00-06.378151.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-00-06.378151.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-00-06.378151.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-00-06.378151.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_00_06.378151
path:
- results_2023-09-11T17-00-06.378151.parquet
- split: latest
path:
- results_2023-09-11T17-00-06.378151.parquet
---
# Dataset Card for Evaluation run of Mikivis/gpt2-large-lora-sft1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Mikivis/gpt2-large-lora-sft1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Mikivis/gpt2-large-lora-sft1](https://huggingface.co/Mikivis/gpt2-large-lora-sft1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft1",
"harness_truthfulqa_mc_0",
split="train")
```
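
Each configuration listed in the metadata above also exposes a `latest` split alongside the timestamped one. As a minimal sketch grounded in those configs, the aggregated metrics can be loaded from the `results` configuration:

```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent run; the "results"
# configuration mirrors the JSON shown under "Latest results" below.
results = load_dataset(
    "open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft1",
    "results",
    split="latest",
)
```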
## Latest results
These are the [latest results from run 2023-09-11T17:00:06.378151](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft1/blob/main/results_2023-09-11T17-00-06.378151.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25023348485813046,
"acc_stderr": 0.031398973577972646,
"acc_norm": 0.25188038577213395,
"acc_norm_stderr": 0.0314103954161941,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.01484306150773162,
"mc2": 0.39365434891343787,
"mc2_stderr": 0.014312149499056473
},
"harness|arc:challenge|25": {
"acc": 0.21843003412969283,
"acc_stderr": 0.012074291605700975,
"acc_norm": 0.24658703071672355,
"acc_norm_stderr": 0.012595726268790124
},
"harness|hellaswag|10": {
"acc": 0.35769766978689504,
"acc_stderr": 0.004783428874273593,
"acc_norm": 0.4267078271260705,
"acc_norm_stderr": 0.004935882666250471
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.28289473684210525,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.28289473684210525,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483099,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483099
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.30638297872340425,
"acc_stderr": 0.03013590647851756,
"acc_norm": 0.30638297872340425,
"acc_norm_stderr": 0.03013590647851756
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633345,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633345
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302052,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302052
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.26129032258064516,
"acc_stderr": 0.024993053397764822,
"acc_norm": 0.26129032258064516,
"acc_norm_stderr": 0.024993053397764822
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.29292929292929293,
"acc_stderr": 0.03242497958178815,
"acc_norm": 0.29292929292929293,
"acc_norm_stderr": 0.03242497958178815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.032210245080411565,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.032210245080411565
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.021362027725222728,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.021362027725222728
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145658,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145658
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.027553614467863797,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.027553614467863797
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22201834862385322,
"acc_stderr": 0.01781884956479662,
"acc_norm": 0.22201834862385322,
"acc_norm_stderr": 0.01781884956479662
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.028765111718046934,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.028765111718046934
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.03114557065948678,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.03114557065948678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2062780269058296,
"acc_stderr": 0.027157150479563824,
"acc_norm": 0.2062780269058296,
"acc_norm_stderr": 0.027157150479563824
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.34951456310679613,
"acc_stderr": 0.047211885060971744,
"acc_norm": 0.34951456310679613,
"acc_norm_stderr": 0.047211885060971744
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23504273504273504,
"acc_stderr": 0.02777883590493543,
"acc_norm": 0.23504273504273504,
"acc_norm_stderr": 0.02777883590493543
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653697,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653697
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2554278416347382,
"acc_stderr": 0.01559495538445577,
"acc_norm": 0.2554278416347382,
"acc_norm_stderr": 0.01559495538445577
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25139664804469275,
"acc_stderr": 0.014508979453553984,
"acc_norm": 0.25139664804469275,
"acc_norm_stderr": 0.014508979453553984
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3054662379421222,
"acc_stderr": 0.026160584450140488,
"acc_norm": 0.3054662379421222,
"acc_norm_stderr": 0.026160584450140488
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2716049382716049,
"acc_stderr": 0.024748624490537368,
"acc_norm": 0.2716049382716049,
"acc_norm_stderr": 0.024748624490537368
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.02635806569888059,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.02635806569888059
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2503259452411995,
"acc_stderr": 0.01106415102716544,
"acc_norm": 0.2503259452411995,
"acc_norm_stderr": 0.01106415102716544
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16911764705882354,
"acc_stderr": 0.02277086801011303,
"acc_norm": 0.16911764705882354,
"acc_norm_stderr": 0.02277086801011303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.21241830065359477,
"acc_stderr": 0.016547148636203147,
"acc_norm": 0.21241830065359477,
"acc_norm_stderr": 0.016547148636203147
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.039559328617958335,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.039559328617958335
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2,
"acc_stderr": 0.025607375986579153,
"acc_norm": 0.2,
"acc_norm_stderr": 0.025607375986579153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.03410646614071855,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.03410646614071855
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.01484306150773162,
"mc2": 0.39365434891343787,
"mc2_stderr": 0.014312149499056473
}
}
```
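
Alternatively, the same aggregated numbers can be fetched without `datasets`. A minimal sketch, assuming only that the JSON file linked above sits at the repository root (as its URL suggests):

```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results JSON linked above and read one metric.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft1",
    filename="results_2023-09-11T17-00-06.378151.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

print(results["all"]["acc"])  # 0.25023348485813046 for the run above
```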
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details__fsx_shared-falcon-180B_converted_200 | 2023-09-11T17:08:29.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of _fsx_shared-falcon-180B_converted_200
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [_fsx_shared-falcon-180B_converted_200](https://huggingface.co/_fsx_shared-falcon-180B_converted_200)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 1 configuration, corresponding to the evaluated task.\n\
\nThe dataset has been created from 1 run(s). Each run can be found as a specific\
\ split in each configuration, the split being named using the timestamp of the\
\ run. The \"train\" split always points to the latest results.\n\nAn additional\
\ configuration \"results\" stores all the aggregated results of the run (and is\
\ used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details__fsx_shared-falcon-180B_converted_200\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T17:08:24.221910](https://huggingface.co/datasets/open-llm-leaderboard/details__fsx_shared-falcon-180B_converted_200/blob/main/results_2023-09-11T17-08-24.221910.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"mc1\": 0.43451652386780903,\n\
\ \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.6219568610455939,\n\
\ \"mc2_stderr\": 0.01539764866197052\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.43451652386780903,\n \"mc1_stderr\": 0.017352738749259564,\n\
\ \"mc2\": 0.6219568610455939,\n \"mc2_stderr\": 0.01539764866197052\n\
\ }\n}\n```"
repo_url: https://huggingface.co/_fsx_shared-falcon-180B_converted_200
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_08_24.221910
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-08-24.221910.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-08-24.221910.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_08_24.221910
path:
- results_2023-09-11T17-08-24.221910.parquet
- split: latest
path:
- results_2023-09-11T17-08-24.221910.parquet
---
# Dataset Card for Evaluation run of _fsx_shared-falcon-180B_converted_200
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/_fsx_shared-falcon-180B_converted_200
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [_fsx_shared-falcon-180B_converted_200](https://huggingface.co/_fsx_shared-falcon-180B_converted_200) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, corresponding to the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details__fsx_shared-falcon-180B_converted_200",
"harness_truthfulqa_mc_0",
split="train")
```
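
The per-sample records behind these scores can also be inspected directly. A minimal sketch (the exact column names depend on the harness version used for the run):

```python
from datasets import load_dataset

# Inspect the per-sample TruthfulQA details from the latest run.
details = load_dataset(
    "open-llm-leaderboard/details__fsx_shared-falcon-180B_converted_200",
    "harness_truthfulqa_mc_0",
    split="latest",
)
df = details.to_pandas()  # pandas is pulled in as a dependency of `datasets`
print(df.head())
```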
## Latest results
These are the [latest results from run 2023-09-11T17:08:24.221910](https://huggingface.co/datasets/open-llm-leaderboard/details__fsx_shared-falcon-180B_converted_200/blob/main/results_2023-09-11T17-08-24.221910.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.6219568610455939,
"mc2_stderr": 0.01539764866197052
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.6219568610455939,
"mc2_stderr": 0.01539764866197052
}
}
```
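
To pin an exact run instead of whatever `latest` currently points to, the timestamped split name from the configuration metadata above can be passed explicitly. A minimal sketch, assuming the split name exactly as listed:

```python
from datasets import load_dataset

# Load the aggregated metrics of one specific run via its timestamped split.
results = load_dataset(
    "open-llm-leaderboard/details__fsx_shared-falcon-180B_converted_200",
    "results",
    split="2023_09_11T17_08_24.221910",
)
```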
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TFLai__Nova-13B-50-step | 2023-09-11T17:11:29.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TFLai/Nova-13B-50-step
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/Nova-13B-50-step](https://huggingface.co/TFLai/Nova-13B-50-step) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__Nova-13B-50-step\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T17:10:10.050390](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Nova-13B-50-step/blob/main/results_2023-09-11T17-10-10.050390.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5733454313030137,\n\
\ \"acc_stderr\": 0.03437268414009844,\n \"acc_norm\": 0.5776852246487773,\n\
\ \"acc_norm_stderr\": 0.0343503409094055,\n \"mc1\": 0.35862913096695226,\n\
\ \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5153458883991275,\n\
\ \"mc2_stderr\": 0.01543073554546003\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5631399317406144,\n \"acc_stderr\": 0.014494421584256519,\n\
\ \"acc_norm\": 0.6160409556313993,\n \"acc_norm_stderr\": 0.01421244498065189\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6199960167297351,\n\
\ \"acc_stderr\": 0.004843954338451441,\n \"acc_norm\": 0.8231428002389962,\n\
\ \"acc_norm_stderr\": 0.003807680331172903\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6452830188679245,\n \"acc_stderr\": 0.02944517532819959,\n\
\ \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.02944517532819959\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.03765746693865151,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.03765746693865151\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342658,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.024552292209342658\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n\
\ \"acc_stderr\": 0.026522709674667768,\n \"acc_norm\": 0.6806451612903226,\n\
\ \"acc_norm_stderr\": 0.026522709674667768\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.02717121368316453,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.02717121368316453\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n\
\ \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945284,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945284\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096626,\n \
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7871559633027523,\n \"acc_stderr\": 0.017549376389313694,\n \"\
acc_norm\": 0.7871559633027523,\n \"acc_norm_stderr\": 0.017549376389313694\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.03409386946992699,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.03409386946992699\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293433,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293433\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.03252113489929188,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.03252113489929188\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041018,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041018\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.02390232554956041,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.02390232554956041\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7535121328224776,\n\
\ \"acc_stderr\": 0.015411308769686936,\n \"acc_norm\": 0.7535121328224776,\n\
\ \"acc_norm_stderr\": 0.015411308769686936\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.026296227915613674,\n\
\ \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.026296227915613674\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3418994413407821,\n\
\ \"acc_stderr\": 0.015864506461604644,\n \"acc_norm\": 0.3418994413407821,\n\
\ \"acc_norm_stderr\": 0.015864506461604644\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.02824513402438729,\n\
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.02824513402438729\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.027210420375934016,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.027210420375934016\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507898,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507898\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\
\ \"acc_stderr\": 0.012604960816087368,\n \"acc_norm\": 0.4198174706649283,\n\
\ \"acc_norm_stderr\": 0.012604960816087368\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5588235294117647,\n \"acc_stderr\": 0.020087362076702857,\n \
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.020087362076702857\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425464,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425464\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.031067211262872468,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.031067211262872468\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.031871875379197966,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.031871875379197966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35862913096695226,\n\
\ \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5153458883991275,\n\
\ \"mc2_stderr\": 0.01543073554546003\n }\n}\n```"
repo_url: https://huggingface.co/TFLai/Nova-13B-50-step
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-10-10.050390.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-10-10.050390.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-10-10.050390.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-10-10.050390.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_10_10.050390
path:
- results_2023-09-11T17-10-10.050390.parquet
- split: latest
path:
- results_2023-09-11T17-10-10.050390.parquet
---
# Dataset Card for Evaluation run of TFLai/Nova-13B-50-step
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/Nova-13B-50-step
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/Nova-13B-50-step](https://huggingface.co/TFLai/Nova-13B-50-step) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__Nova-13B-50-step",
"harness_truthfulqa_mc_0",
	split="latest")
```
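As a quick way to explore what is available, here is a minimal sketch (assuming only the `datasets` library) that lists the configurations and loads the aggregated `results` configuration; the "latest" split always resolves to the newest run:
```python
from datasets import get_dataset_config_names, load_dataset

# List the 61 per-task configurations plus the aggregated "results" config
configs = get_dataset_config_names("open-llm-leaderboard/details_TFLai__Nova-13B-50-step")
print(len(configs), configs[:3])

# Load the aggregated results; "latest" always points to the newest run
results = load_dataset(
    "open-llm-leaderboard/details_TFLai__Nova-13B-50-step",
    "results",
    split="latest",
)
print(results[0])
```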
## Latest results
These are the [latest results from run 2023-09-11T17:10:10.050390](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Nova-13B-50-step/blob/main/results_2023-09-11T17-10-10.050390.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5733454313030137,
"acc_stderr": 0.03437268414009844,
"acc_norm": 0.5776852246487773,
"acc_norm_stderr": 0.0343503409094055,
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502022,
"mc2": 0.5153458883991275,
"mc2_stderr": 0.01543073554546003
},
"harness|arc:challenge|25": {
"acc": 0.5631399317406144,
"acc_stderr": 0.014494421584256519,
"acc_norm": 0.6160409556313993,
"acc_norm_stderr": 0.01421244498065189
},
"harness|hellaswag|10": {
"acc": 0.6199960167297351,
"acc_stderr": 0.004843954338451441,
"acc_norm": 0.8231428002389962,
"acc_norm_stderr": 0.003807680331172903
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6452830188679245,
"acc_stderr": 0.02944517532819959,
"acc_norm": 0.6452830188679245,
"acc_norm_stderr": 0.02944517532819959
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.03765746693865151,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.03765746693865151
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.024552292209342658,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.024552292209342658
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667768,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667768
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.02717121368316453,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.02717121368316453
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5948717948717949,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.5948717948717949,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945284,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096626,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7871559633027523,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.7871559633027523,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.03409386946992699,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.03409386946992699
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293433,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293433
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.03252113489929188,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.03252113489929188
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.04236964753041018,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.04236964753041018
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.02390232554956041,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.02390232554956041
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7535121328224776,
"acc_stderr": 0.015411308769686936,
"acc_norm": 0.7535121328224776,
"acc_norm_stderr": 0.015411308769686936
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.026296227915613674,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.026296227915613674
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3418994413407821,
"acc_stderr": 0.015864506461604644,
"acc_norm": 0.3418994413407821,
"acc_norm_stderr": 0.015864506461604644
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.02824513402438729,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.02824513402438729
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934016,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934016
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507898,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507898
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4198174706649283,
"acc_stderr": 0.012604960816087368,
"acc_norm": 0.4198174706649283,
"acc_norm_stderr": 0.012604960816087368
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.020087362076702857,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.020087362076702857
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425464,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425464
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.031067211262872468,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.031067211262872468
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.031871875379197966,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.031871875379197966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502022,
"mc2": 0.5153458883991275,
"mc2_stderr": 0.01543073554546003
}
}
```
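If you prefer to work with the raw results file linked above, here is a minimal sketch (assuming the `huggingface_hub` library) of downloading it and reading the aggregate metrics; note that the exact nesting of the JSON may differ between harness versions:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file referenced in the link above
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_TFLai__Nova-13B-50-step",
    filename="results_2023-09-11T17-10-10.050390.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Aggregate metrics live under "all"; some files nest them under "results"
metrics = data.get("results", data)["all"]
print(metrics["acc"], metrics["acc_norm"], metrics["mc2"])
```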
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified1 | 2023-09-11T17:14:01.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mctaco-modified1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Charlie911/vicuna-7b-v1.5-lora-mctaco-modified1](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mctaco-modified1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T17:12:40.999408](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified1/blob/main/results_2023-09-11T17-12-40.999408.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4735381344797782,\n\
\ \"acc_stderr\": 0.03503713249545751,\n \"acc_norm\": 0.4774473391210312,\n\
\ \"acc_norm_stderr\": 0.03503294446540418,\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474209,\n \"mc2\": 0.3986793440080272,\n\
\ \"mc2_stderr\": 0.01468826269806486\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.363481228668942,\n \"acc_stderr\": 0.014056207319068283,\n\
\ \"acc_norm\": 0.4087030716723549,\n \"acc_norm_stderr\": 0.014365750345427008\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5485958972316272,\n\
\ \"acc_stderr\": 0.004966158142645412,\n \"acc_norm\": 0.734017128062139,\n\
\ \"acc_norm_stderr\": 0.004409521343140117\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750573,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750573\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5056603773584906,\n \"acc_stderr\": 0.030770900763851302,\n\
\ \"acc_norm\": 0.5056603773584906,\n \"acc_norm_stderr\": 0.030770900763851302\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.4861111111111111,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307811,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307811\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.032555253593403555,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.032555253593403555\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3201058201058201,\n \"acc_stderr\": 0.0240268463928735,\n \"acc_norm\"\
: 0.3201058201058201,\n \"acc_norm_stderr\": 0.0240268463928735\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.0393253768039287,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.0393253768039287\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5064516129032258,\n\
\ \"acc_stderr\": 0.02844163823354051,\n \"acc_norm\": 0.5064516129032258,\n\
\ \"acc_norm_stderr\": 0.02844163823354051\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.033959703819985726,\n\
\ \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.033959703819985726\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.037425970438065864,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.037425970438065864\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5959595959595959,\n \"acc_stderr\": 0.03496130972056129,\n \"\
acc_norm\": 0.5959595959595959,\n \"acc_norm_stderr\": 0.03496130972056129\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414357,\n\
\ \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414357\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.39487179487179486,\n \"acc_stderr\": 0.024784316942156378,\n\
\ \"acc_norm\": 0.39487179487179486,\n \"acc_norm_stderr\": 0.024784316942156378\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507384,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507384\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42016806722689076,\n \"acc_stderr\": 0.03206183783236152,\n\
\ \"acc_norm\": 0.42016806722689076,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6422018348623854,\n \"acc_stderr\": 0.020552060784827835,\n \"\
acc_norm\": 0.6422018348623854,\n \"acc_norm_stderr\": 0.020552060784827835\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.031280390843298825,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.031280390843298825\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6764705882352942,\n \"acc_stderr\": 0.03283472056108561,\n \"\
acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03283472056108561\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.030685820596610795,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.030685820596610795\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5426008968609866,\n\
\ \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.5426008968609866,\n\
\ \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.04384140024078016,\n\
\ \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.04384140024078016\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5619834710743802,\n \"acc_stderr\": 0.045291468044357915,\n \"\
acc_norm\": 0.5619834710743802,\n \"acc_norm_stderr\": 0.045291468044357915\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.04832853553437056,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.04832853553437056\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.039265223787088445,\n\
\ \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.039265223787088445\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.045218299028335865,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.045218299028335865\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5922330097087378,\n \"acc_stderr\": 0.048657775704107696,\n\
\ \"acc_norm\": 0.5922330097087378,\n \"acc_norm_stderr\": 0.048657775704107696\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7350427350427351,\n\
\ \"acc_stderr\": 0.028911208802749493,\n \"acc_norm\": 0.7350427350427351,\n\
\ \"acc_norm_stderr\": 0.028911208802749493\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.685823754789272,\n\
\ \"acc_stderr\": 0.016599291735884907,\n \"acc_norm\": 0.685823754789272,\n\
\ \"acc_norm_stderr\": 0.016599291735884907\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.48554913294797686,\n \"acc_stderr\": 0.02690784985628254,\n\
\ \"acc_norm\": 0.48554913294797686,\n \"acc_norm_stderr\": 0.02690784985628254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25921787709497207,\n\
\ \"acc_stderr\": 0.014655780837497735,\n \"acc_norm\": 0.25921787709497207,\n\
\ \"acc_norm_stderr\": 0.014655780837497735\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.46405228758169936,\n \"acc_stderr\": 0.028555827516528777,\n\
\ \"acc_norm\": 0.46405228758169936,\n \"acc_norm_stderr\": 0.028555827516528777\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5498392282958199,\n\
\ \"acc_stderr\": 0.02825666072336018,\n \"acc_norm\": 0.5498392282958199,\n\
\ \"acc_norm_stderr\": 0.02825666072336018\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5679012345679012,\n \"acc_stderr\": 0.027563010971606672,\n\
\ \"acc_norm\": 0.5679012345679012,\n \"acc_norm_stderr\": 0.027563010971606672\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3546099290780142,\n \"acc_stderr\": 0.02853865002887864,\n \
\ \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.02853865002887864\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37809647979139505,\n\
\ \"acc_stderr\": 0.012384878406798095,\n \"acc_norm\": 0.37809647979139505,\n\
\ \"acc_norm_stderr\": 0.012384878406798095\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4522058823529412,\n \"acc_stderr\": 0.030233758551596452,\n\
\ \"acc_norm\": 0.4522058823529412,\n \"acc_norm_stderr\": 0.030233758551596452\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.46405228758169936,\n \"acc_stderr\": 0.020175488765484043,\n \
\ \"acc_norm\": 0.46405228758169936,\n \"acc_norm_stderr\": 0.020175488765484043\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n\
\ \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n\
\ \"acc_stderr\": 0.032801882053486435,\n \"acc_norm\": 0.6865671641791045,\n\
\ \"acc_norm_stderr\": 0.032801882053486435\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n\
\ \"acc_stderr\": 0.03726214354322415,\n \"acc_norm\": 0.35542168674698793,\n\
\ \"acc_norm_stderr\": 0.03726214354322415\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474209,\n \"mc2\": 0.3986793440080272,\n\
\ \"mc2_stderr\": 0.01468826269806486\n }\n}\n```"
repo_url: https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mctaco-modified1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-12-40.999408.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-12-40.999408.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-12-40.999408.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-12-40.999408.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_12_40.999408
path:
- results_2023-09-11T17-12-40.999408.parquet
- split: latest
path:
- results_2023-09-11T17-12-40.999408.parquet
---
# Dataset Card for Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mctaco-modified1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mctaco-modified1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Charlie911/vicuna-7b-v1.5-lora-mctaco-modified1](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mctaco-modified1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified1",
"harness_truthfulqa_mc_0",
split="train")
```
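
As a hedged aside (not part of the auto-generated card): before loading, you can enumerate the available configurations programmatically. The sketch below uses `get_dataset_config_names` from the `datasets` library; the config name it loads is taken from the YAML header of this card.

```python
# Minimal sketch: discover what this repo exposes before loading anything.
# Each per-task config carries a timestamped split plus a "latest" alias.
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified1"

# All 61 per-task configs, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:3])

# Load one task's per-sample details at its most recent evaluation run.
details = load_dataset(repo, "harness_hendrycksTest_world_religions_5", split="latest")
print(details)
```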
## Latest results
These are the [latest results from run 2023-09-11T17:12:40.999408](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified1/blob/main/results_2023-09-11T17-12-40.999408.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4735381344797782,
"acc_stderr": 0.03503713249545751,
"acc_norm": 0.4774473391210312,
"acc_norm_stderr": 0.03503294446540418,
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474209,
"mc2": 0.3986793440080272,
"mc2_stderr": 0.01468826269806486
},
"harness|arc:challenge|25": {
"acc": 0.363481228668942,
"acc_stderr": 0.014056207319068283,
"acc_norm": 0.4087030716723549,
"acc_norm_stderr": 0.014365750345427008
},
"harness|hellaswag|10": {
"acc": 0.5485958972316272,
"acc_stderr": 0.004966158142645412,
"acc_norm": 0.734017128062139,
"acc_norm_stderr": 0.004409521343140117
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750573,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750573
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5056603773584906,
"acc_stderr": 0.030770900763851302,
"acc_norm": 0.5056603773584906,
"acc_norm_stderr": 0.030770900763851302
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307811,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307811
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.0240268463928735,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.0240268463928735
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.0393253768039287,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.0393253768039287
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5064516129032258,
"acc_stderr": 0.02844163823354051,
"acc_norm": 0.5064516129032258,
"acc_norm_stderr": 0.02844163823354051
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.033959703819985726,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.033959703819985726
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.037425970438065864,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.037425970438065864
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5959595959595959,
"acc_stderr": 0.03496130972056129,
"acc_norm": 0.5959595959595959,
"acc_norm_stderr": 0.03496130972056129
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.03447478286414357,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.03447478286414357
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.39487179487179486,
"acc_stderr": 0.024784316942156378,
"acc_norm": 0.39487179487179486,
"acc_norm_stderr": 0.024784316942156378
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507384,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507384
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42016806722689076,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.42016806722689076,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6422018348623854,
"acc_stderr": 0.020552060784827835,
"acc_norm": 0.6422018348623854,
"acc_norm_stderr": 0.020552060784827835
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.031280390843298825,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.031280390843298825
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03283472056108561,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03283472056108561
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.030685820596610795,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.030685820596610795
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5426008968609866,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.5426008968609866,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5114503816793893,
"acc_stderr": 0.04384140024078016,
"acc_norm": 0.5114503816793893,
"acc_norm_stderr": 0.04384140024078016
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5619834710743802,
"acc_stderr": 0.045291468044357915,
"acc_norm": 0.5619834710743802,
"acc_norm_stderr": 0.045291468044357915
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437056,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437056
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5153374233128835,
"acc_stderr": 0.039265223787088445,
"acc_norm": 0.5153374233128835,
"acc_norm_stderr": 0.039265223787088445
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.045218299028335865,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.045218299028335865
},
"harness|hendrycksTest-management|5": {
"acc": 0.5922330097087378,
"acc_stderr": 0.048657775704107696,
"acc_norm": 0.5922330097087378,
"acc_norm_stderr": 0.048657775704107696
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7350427350427351,
"acc_stderr": 0.028911208802749493,
"acc_norm": 0.7350427350427351,
"acc_norm_stderr": 0.028911208802749493
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.685823754789272,
"acc_stderr": 0.016599291735884907,
"acc_norm": 0.685823754789272,
"acc_norm_stderr": 0.016599291735884907
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.02690784985628254,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.02690784985628254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25921787709497207,
"acc_stderr": 0.014655780837497735,
"acc_norm": 0.25921787709497207,
"acc_norm_stderr": 0.014655780837497735
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.46405228758169936,
"acc_stderr": 0.028555827516528777,
"acc_norm": 0.46405228758169936,
"acc_norm_stderr": 0.028555827516528777
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5498392282958199,
"acc_stderr": 0.02825666072336018,
"acc_norm": 0.5498392282958199,
"acc_norm_stderr": 0.02825666072336018
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5679012345679012,
"acc_stderr": 0.027563010971606672,
"acc_norm": 0.5679012345679012,
"acc_norm_stderr": 0.027563010971606672
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.02853865002887864,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.02853865002887864
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37809647979139505,
"acc_stderr": 0.012384878406798095,
"acc_norm": 0.37809647979139505,
"acc_norm_stderr": 0.012384878406798095
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4522058823529412,
"acc_stderr": 0.030233758551596452,
"acc_norm": 0.4522058823529412,
"acc_norm_stderr": 0.030233758551596452
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.46405228758169936,
"acc_stderr": 0.020175488765484043,
"acc_norm": 0.46405228758169936,
"acc_norm_stderr": 0.020175488765484043
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5387755102040817,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.5387755102040817,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6865671641791045,
"acc_stderr": 0.032801882053486435,
"acc_norm": 0.6865671641791045,
"acc_norm_stderr": 0.032801882053486435
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322415,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322415
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474209,
"mc2": 0.3986793440080272,
"mc2_stderr": 0.01468826269806486
}
}
```
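
The same aggregated numbers can also be pulled programmatically from the "results" configuration rather than copied from the JSON blob above. The exact column layout of the results parquet is not documented in this card, so the following hedged sketch only inspects the schema before reading the row:

```python
# Hedged sketch: load the aggregated "results" config at its latest run.
from datasets import load_dataset

repo = "open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified1"
results = load_dataset(repo, "results", split="latest")

print(results.column_names)  # discover the schema before relying on any field
print(results[0])            # first row of the aggregated results for this run
```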
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified2 | 2023-09-11T17:16:42.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mctaco-modified2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Charlie911/vicuna-7b-v1.5-lora-mctaco-modified2](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mctaco-modified2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-11T17:15:24.260844](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified2/blob/main/results_2023-09-11T17-15-24.260844.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48431322113675834,\n\
\ \"acc_stderr\": 0.035096519029320775,\n \"acc_norm\": 0.48827018319996285,\n\
\ \"acc_norm_stderr\": 0.0350905391376541,\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080508,\n \"mc2\": 0.40427814325747613,\n\
\ \"mc2_stderr\": 0.014149691859006174\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.386518771331058,\n \"acc_stderr\": 0.014230084761910471,\n\
\ \"acc_norm\": 0.42918088737201365,\n \"acc_norm_stderr\": 0.014464085894870653\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5488946425014938,\n\
\ \"acc_stderr\": 0.0049658660983181715,\n \"acc_norm\": 0.7396932881896037,\n\
\ \"acc_norm_stderr\": 0.004379051357024145\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.040335656678483184,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.040335656678483184\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5094339622641509,\n \"acc_stderr\": 0.030767394707808086,\n\
\ \"acc_norm\": 0.5094339622641509,\n \"acc_norm_stderr\": 0.030767394707808086\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.37572254335260113,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416542,\n\
\ \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416542\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715563,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29894179894179895,\n \"acc_stderr\": 0.02357760479165581,\n \"\
acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.02357760479165581\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5548387096774193,\n \"acc_stderr\": 0.02827241018621491,\n \"\
acc_norm\": 0.5548387096774193,\n \"acc_norm_stderr\": 0.02827241018621491\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39408866995073893,\n \"acc_stderr\": 0.03438157967036544,\n \"\
acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.03438157967036544\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6161616161616161,\n \"acc_stderr\": 0.0346488167501634,\n \"acc_norm\"\
: 0.6161616161616161,\n \"acc_norm_stderr\": 0.0346488167501634\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.6787564766839378,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45897435897435895,\n \"acc_stderr\": 0.025265525491284295,\n\
\ \"acc_norm\": 0.45897435897435895,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.41596638655462187,\n \"acc_stderr\": 0.03201650100739615,\n\
\ \"acc_norm\": 0.41596638655462187,\n \"acc_norm_stderr\": 0.03201650100739615\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6770642201834862,\n \"acc_stderr\": 0.020048115923415318,\n \"\
acc_norm\": 0.6770642201834862,\n \"acc_norm_stderr\": 0.020048115923415318\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6813725490196079,\n \"acc_stderr\": 0.0327028718148208,\n \"acc_norm\"\
: 0.6813725490196079,\n \"acc_norm_stderr\": 0.0327028718148208\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.6919831223628692,\n \"acc_stderr\": 0.0300523893356057,\n \"acc_norm\"\
: 0.6919831223628692,\n \"acc_norm_stderr\": 0.0300523893356057\n },\n\
\ \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578757,\n\
\ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578757\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5619834710743802,\n \"acc_stderr\": 0.045291468044357915,\n \"\
acc_norm\": 0.5619834710743802,\n \"acc_norm_stderr\": 0.045291468044357915\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04833682445228318,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04833682445228318\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.0482572933735639,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.0482572933735639\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7350427350427351,\n\
\ \"acc_stderr\": 0.02891120880274948,\n \"acc_norm\": 0.7350427350427351,\n\
\ \"acc_norm_stderr\": 0.02891120880274948\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.0498887651569859,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6807151979565773,\n\
\ \"acc_stderr\": 0.01667126174953874,\n \"acc_norm\": 0.6807151979565773,\n\
\ \"acc_norm_stderr\": 0.01667126174953874\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.0269150473553698,\n\
\ \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.0269150473553698\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4738562091503268,\n \"acc_stderr\": 0.028590752958852394,\n\
\ \"acc_norm\": 0.4738562091503268,\n \"acc_norm_stderr\": 0.028590752958852394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5627009646302251,\n\
\ \"acc_stderr\": 0.0281739177617629,\n \"acc_norm\": 0.5627009646302251,\n\
\ \"acc_norm_stderr\": 0.0281739177617629\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.027648477877413327,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.027648477877413327\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3475177304964539,\n \"acc_stderr\": 0.028406627809590954,\n \
\ \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.028406627809590954\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3650586701434159,\n\
\ \"acc_stderr\": 0.012296373743443478,\n \"acc_norm\": 0.3650586701434159,\n\
\ \"acc_norm_stderr\": 0.012296373743443478\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4624183006535948,\n \"acc_stderr\": 0.020170614974969768,\n \
\ \"acc_norm\": 0.4624183006535948,\n \"acc_norm_stderr\": 0.020170614974969768\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5673469387755102,\n \"acc_stderr\": 0.03171752824062664,\n\
\ \"acc_norm\": 0.5673469387755102,\n \"acc_norm_stderr\": 0.03171752824062664\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6616915422885572,\n\
\ \"acc_stderr\": 0.03345563070339192,\n \"acc_norm\": 0.6616915422885572,\n\
\ \"acc_norm_stderr\": 0.03345563070339192\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n\
\ \"acc_stderr\": 0.037777988227480165,\n \"acc_norm\": 0.3795180722891566,\n\
\ \"acc_norm_stderr\": 0.037777988227480165\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080508,\n \"mc2\": 0.40427814325747613,\n\
\ \"mc2_stderr\": 0.014149691859006174\n }\n}\n```"
repo_url: https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mctaco-modified2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-15-24.260844.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-15-24.260844.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-15-24.260844.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-15-24.260844.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_15_24.260844
path:
- results_2023-09-11T17-15-24.260844.parquet
- split: latest
path:
- results_2023-09-11T17-15-24.260844.parquet
---
# Dataset Card for Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mctaco-modified2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mctaco-modified2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Charlie911/vicuna-7b-v1.5-lora-mctaco-modified2](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mctaco-modified2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified2",
"harness_truthfulqa_mc_0",
	split="latest")
```
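
The aggregated metrics for this run live in the `results` configuration (see the YAML above). A minimal sketch for loading them, assuming the same repository and the `latest` split:

```python
from datasets import load_dataset

# Sketch: the "results" config holds the aggregated metrics of the run;
# the "latest" split always points to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified2",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated scores
```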
## Latest results
These are the [latest results from run 2023-09-11T17:15:24.260844](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified2/blob/main/results_2023-09-11T17-15-24.260844.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the `results` configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.48431322113675834,
"acc_stderr": 0.035096519029320775,
"acc_norm": 0.48827018319996285,
"acc_norm_stderr": 0.0350905391376541,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080508,
"mc2": 0.40427814325747613,
"mc2_stderr": 0.014149691859006174
},
"harness|arc:challenge|25": {
"acc": 0.386518771331058,
"acc_stderr": 0.014230084761910471,
"acc_norm": 0.42918088737201365,
"acc_norm_stderr": 0.014464085894870653
},
"harness|hellaswag|10": {
"acc": 0.5488946425014938,
"acc_stderr": 0.0049658660983181715,
"acc_norm": 0.7396932881896037,
"acc_norm_stderr": 0.004379051357024145
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.040335656678483184,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.040335656678483184
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5094339622641509,
"acc_stderr": 0.030767394707808086,
"acc_norm": 0.5094339622641509,
"acc_norm_stderr": 0.030767394707808086
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03708284662416542,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03708284662416542
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715563,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.02357760479165581,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.02357760479165581
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5548387096774193,
"acc_stderr": 0.02827241018621491,
"acc_norm": 0.5548387096774193,
"acc_norm_stderr": 0.02827241018621491
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.03438157967036544,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.03438157967036544
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6161616161616161,
"acc_stderr": 0.0346488167501634,
"acc_norm": 0.6161616161616161,
"acc_norm_stderr": 0.0346488167501634
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6787564766839378,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.6787564766839378,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45897435897435895,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.45897435897435895,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6770642201834862,
"acc_stderr": 0.020048115923415318,
"acc_norm": 0.6770642201834862,
"acc_norm_stderr": 0.020048115923415318
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.0327028718148208,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.0327028718148208
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6919831223628692,
"acc_stderr": 0.0300523893356057,
"acc_norm": 0.6919831223628692,
"acc_norm_stderr": 0.0300523893356057
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5695067264573991,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.5695067264573991,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578757,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578757
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5619834710743802,
"acc_stderr": 0.045291468044357915,
"acc_norm": 0.5619834710743802,
"acc_norm_stderr": 0.045291468044357915
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5,
"acc_stderr": 0.04833682445228318,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04833682445228318
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5030674846625767,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.5030674846625767,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.0482572933735639,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.0482572933735639
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7350427350427351,
"acc_stderr": 0.02891120880274948,
"acc_norm": 0.7350427350427351,
"acc_norm_stderr": 0.02891120880274948
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.56,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6807151979565773,
"acc_stderr": 0.01667126174953874,
"acc_norm": 0.6807151979565773,
"acc_norm_stderr": 0.01667126174953874
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.0269150473553698,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.0269150473553698
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4738562091503268,
"acc_stderr": 0.028590752958852394,
"acc_norm": 0.4738562091503268,
"acc_norm_stderr": 0.028590752958852394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5627009646302251,
"acc_stderr": 0.0281739177617629,
"acc_norm": 0.5627009646302251,
"acc_norm_stderr": 0.0281739177617629
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.027648477877413327,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.027648477877413327
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3475177304964539,
"acc_stderr": 0.028406627809590954,
"acc_norm": 0.3475177304964539,
"acc_norm_stderr": 0.028406627809590954
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3650586701434159,
"acc_stderr": 0.012296373743443478,
"acc_norm": 0.3650586701434159,
"acc_norm_stderr": 0.012296373743443478
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4624183006535948,
"acc_stderr": 0.020170614974969768,
"acc_norm": 0.4624183006535948,
"acc_norm_stderr": 0.020170614974969768
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5673469387755102,
"acc_stderr": 0.03171752824062664,
"acc_norm": 0.5673469387755102,
"acc_norm_stderr": 0.03171752824062664
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6616915422885572,
"acc_stderr": 0.03345563070339192,
"acc_norm": 0.6616915422885572,
"acc_norm_stderr": 0.03345563070339192
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.037777988227480165,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.037777988227480165
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080508,
"mc2": 0.40427814325747613,
"mc2_stderr": 0.014149691859006174
}
}
```
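
To work with these numbers programmatically, one option is to download the raw results file and parse it directly. A small sketch, assuming the top-level JSON layout matches the excerpt above (the actual file may wrap these entries in additional keys):

```python
import json
from huggingface_hub import hf_hub_download

# Download the per-run results JSON from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mctaco-modified2",
    filename="results_2023-09-11T17-15-24.260844.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Average accuracy over the MMLU (hendrycksTest) subtasks, assuming the
# top-level keys match the excerpt shown above.
mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print(sum(mmlu_accs) / len(mmlu_accs))
```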
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]