| datasetId | card |
|---|---|
stepkurniawan/sustainability-methods-wiki | ---
license: mit
configs:
- config_name: 50_QA
data_files:
- split: train
path: 50_QA/train-*
- config_name: 50_QA_reviewed
data_files:
- split: train
path: 50_QA_reviewed/train-*
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
- config_name: 50_QA
features:
- name: contexts
dtype: string
- name: summary
dtype: string
- name: question
dtype: string
- name: ground_truths
dtype: string
splits:
- name: train
num_bytes: 78182
num_examples: 50
download_size: 57005
dataset_size: 78182
- config_name: 50_QA_reviewed
features:
- name: contexts
dtype: string
- name: summary
dtype: string
- name: question
dtype: string
- name: ground_truths
dtype: string
splits:
- name: train
num_bytes: 78147
num_examples: 50
download_size: 56945
dataset_size: 78147
---
This is a table dump from Prof. Henrik von Wehrden's well-known sustainability wiki. He is a sustainability professor at Leuphana University, Germany, who is passionate about digitalizing his mind; hence the wiki was born.
These wiki pages focus on sustainability and are highly subjective, reflecting his view of the world.
Link: https://sustainabilitymethods.org/index.php/Main_Page |
CyberHarem/tallinn_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tallinn/タリン/塔林 (Azur Lane)
This is the dataset of tallinn/タリン/塔林 (Azur Lane), containing 65 images and their tags.
The core tags of this character are `breasts, long_hair, multicolored_hair, red_hair, large_breasts, streaked_hair, red_eyes, mole, mole_on_breast, white_hair, two-tone_hair, hat, bangs, very_long_hair, white_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 65 | 120.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tallinn_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 65 | 58.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tallinn_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 167 | 129.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tallinn_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 65 | 101.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tallinn_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 167 | 193.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tallinn_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/tallinn_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 27 |  |  |  |  |  | 1girl, solo, cleavage, looking_at_viewer, elbow_gloves, white_dress, white_gloves, bare_shoulders, white_coat, peaked_cap, thighhighs, thigh_boots, white_footwear, fur_trim, simple_background, cross-laced_footwear, parted_bangs, sleeveless_dress, white_background |
| 1 | 25 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cleavage, official_alternate_costume, black_bra, bare_shoulders, black_choker, black_thighhighs, blush, navel, black_shorts, black_tank_top, collarbone, garter_straps, closed_mouth, indoors, parted_bangs, simple_background, couch, crop_top, feet_out_of_frame, grey_hair |
| 2 | 5 |  |  |  |  |  | 1girl, black_belt, black_gloves, cleavage, handcuffs, looking_at_viewer, official_alternate_costume, police_uniform, solo, sunglasses, black_bra, black_choker, black_footwear, black_headwear, high_heels, holding, open_clothes, police_hat, black_shirt, boots, bra_peek, eyewear_on_head, full_body, light_purple_hair, phone, short_sleeves, sitting, thigh_strap, black_skirt, blush, indoors, on_desk, paper, peaked_cap, potted_plant |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | looking_at_viewer | elbow_gloves | white_dress | white_gloves | bare_shoulders | white_coat | peaked_cap | thighhighs | thigh_boots | white_footwear | fur_trim | simple_background | cross-laced_footwear | parted_bangs | sleeveless_dress | white_background | official_alternate_costume | black_bra | black_choker | black_thighhighs | blush | navel | black_shorts | black_tank_top | collarbone | garter_straps | closed_mouth | indoors | couch | crop_top | feet_out_of_frame | grey_hair | black_belt | black_gloves | handcuffs | police_uniform | sunglasses | black_footwear | black_headwear | high_heels | holding | open_clothes | police_hat | black_shirt | boots | bra_peek | eyewear_on_head | full_body | light_purple_hair | phone | short_sleeves | sitting | thigh_strap | black_skirt | on_desk | paper | potted_plant |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------------------|:---------------|:--------------|:---------------|:-----------------|:-------------|:-------------|:-------------|:--------------|:-----------------|:-----------|:--------------------|:-----------------------|:---------------|:-------------------|:-------------------|:-----------------------------|:------------|:---------------|:-------------------|:--------|:--------|:---------------|:-----------------|:-------------|:----------------|:---------------|:----------|:--------|:-----------|:--------------------|:------------|:-------------|:---------------|:------------|:-----------------|:-------------|:-----------------|:-----------------|:-------------|:----------|:---------------|:-------------|:--------------|:--------|:-----------|:------------------|:------------|:--------------------|:--------|:----------------|:----------|:--------------|:--------------|:----------|:--------|:---------------|
| 0 | 27 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 25 |  |  |  |  |  | X | X | X | X | | | | X | | | | | | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | | | | | | X | | | | | | | | | | X | X | X | | X | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
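As a rough illustration of mining outfits from the clusters, the comma-separated tag strings in the table above can be parsed into sets and compared. This is a sketch, not part of the dataset tooling, and the tag strings below are shortened excerpts of clusters 0 and 1:

```python
# Sketch: compare the tag sets of two clusters to separate shared
# character-level tags from outfit-specific ones.
def parse_tags(tags: str) -> set:
    """Split a comma-separated tag string into a set of tags."""
    return {t.strip() for t in tags.split(',') if t.strip()}

cluster0 = parse_tags("1girl, solo, cleavage, looking_at_viewer, white_dress, white_gloves")
cluster1 = parse_tags("1girl, solo, cleavage, looking_at_viewer, black_bra, black_choker")

shared = cluster0 & cluster1    # tags common to both clusters (likely the character)
outfit0 = cluster0 - cluster1   # tags specific to cluster 0's outfit
outfit1 = cluster1 - cluster0   # tags specific to cluster 1's outfit
```

Tags appearing in every cluster tend to describe the character herself, while the set differences surface outfit-specific tags.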
|
ChiyuSONG/Uni-Encoder | ---
license: mit
task_categories:
- conversational
language:
- en
- zh
---
<p align="center">
💻 <a href="https://github.com/dll-wu/Uni-Encoder" target="_blank">[Github Repo]</a> • 📃 <a href="https://arxiv.org/abs/2106.01263" target="_blank">[Paper]</a>
</p>
## Overview
This is a collection of the datasets used in the paper titled "Uni-Encoder: A Fast and Accurate Response Selection Paradigm for Generation-Based Dialogue Systems".
The following datasets have been included:
- Ubuntu Corpus V1
- Ubuntu Corpus V2
- PersonaChat
- Douban Conv Corpus
All datasets have been standardized to a unified format for research needs.
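The card does not document the actual unified schema, but as a purely hypothetical illustration of the response-selection task the paper addresses, an example typically pairs a dialogue context with several candidate responses, exactly one of which is the ground-truth reply (field names below are assumptions, not the dataset's real format):

```python
# Hypothetical response-selection example: a dialogue context, N candidate
# responses, and the index of the correct one.
example = {
    "context": ["Hi, how are you?", "Good, thanks! And you?"],
    "candidates": ["I'm fine too.", "The train leaves at 9.", "Blue is my color."],
    "label": 0,  # index of the ground-truth response
}

def correct_response(ex: dict) -> str:
    """Return the ground-truth response of a response-selection example."""
    return ex["candidates"][ex["label"]]
```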
## Citation
```
@inproceedings{song2023uni,
title={Uni-encoder: A fast and accurate response selection paradigm for generation-based dialogue systems},
author={Song, Chiyu and He, Hongliang and Yu, Haofei and Fang, Pengfei and Cui, Leyang and Lan, Zhenzhong},
booktitle={Findings of the Association for Computational Linguistics: ACL 2023},
pages={6231--6244},
year={2023}
}
```
|
liuyanchen1015/MULTI_VALUE_sst2_fronting_pobj | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 55925
num_examples: 416
- name: test
num_bytes: 115114
num_examples: 866
- name: train
num_bytes: 2084328
num_examples: 20542
download_size: 1368224
dataset_size: 2255367
---
# Dataset Card for "MULTI_VALUE_sst2_fronting_pobj"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
stoptalk344/vacuum1333 | ---
license: apache-2.0
---
|
Ansh007/Jellyfish-Image-Dataset | ---
license: cc-by-4.0
---
# Summary
>This dataset contains 900 images of jellyfish belonging to six different categories and species: mauve stinger jellyfish, moon jellyfish, barrel jellyfish, blue jellyfish, compass jellyfish, and lion’s mane jellyfish. You can apply ML techniques to gain insights into jellyfish classification, species identification, and color analysis.
# More interesting datasets tailored to your requirements
## Leave a request at [https://trainingdata.pro/data-market](https://trainingdata.pro/data-market?utm_source=kaggle-partner-anshtanwar&utm_medium=cpc&utm_campaign=jellyfish-types) to discuss your requirements and order a similar dataset tailored to your research, project or business.
# Types of Jellyfish - Description
1. **Moon jellyfish (Aurelia aurita)**: Common jellyfish with four horseshoe-shaped gonads visible through the top of its translucent bell. It feeds by collecting medusae, plankton, and mollusks with its tentacles.<br>
2. **Barrel jellyfish (Rhizostoma pulmo)**: Largest jellyfish found in British waters, with a bell that can grow up to 90 cm in diameter. It feeds on plankton and small fish by catching them in its tentacles.<br>
3. **Blue jellyfish (Cyanea lamarckii)**: Large jellyfish that can grow up to 30 cm in diameter. It feeds on plankton and small fish by catching them in its tentacles.<br>
4. **Compass jellyfish (Chrysaora hysoscella)**: Named after the brown markings on its bell that resemble a compass rose. It feeds on plankton and small fish by catching them in its tentacles.<br>
5. **Lion’s mane jellyfish (Cyanea capillata)**: Largest jellyfish in the world, with a bell that can grow up to 2 meters in diameter and tentacles that can reach up to 30 meters in length. It feeds on plankton and small fish by catching them in its tentacles.<br>
6. **Mauve stinger (Pelagia noctiluca)**: Small jellyfish with long tentacles and warty structures on its bell full of stinging cells. It feeds on other small jellyfish and oceanic sea squirts.<br>
# Use Cases
>- **Jellyfish classification**: Use machine learning techniques to classify jellyfish images into different categories based on their physical characteristics.
>- **Species identification**: Use machine learning techniques to identify the species of jellyfish in your dataset based on their physical characteristics.
>- **Color analysis**: Use machine learning techniques to analyze the color patterns of jellyfish in your dataset.
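The classification use case starts from a plain labelling step. As a minimal sketch (the folder names and the mapping below are assumptions for illustration, not part of the dataset), the six species named above can be encoded as integer class IDs:

```python
# Map the six jellyfish species to integer class labels for a typical
# image-classification setup. Order follows the card's species list.
SPECIES = [
    "mauve_stinger", "moon_jellyfish", "barrel_jellyfish",
    "blue_jellyfish", "compass_jellyfish", "lions_mane_jellyfish",
]
LABEL2ID = {name: i for i, name in enumerate(SPECIES)}
ID2LABEL = {i: name for i, name in enumerate(SPECIES)}

def encode_labels(folder_names):
    """Map a list of species folder names to integer class IDs."""
    return [LABEL2ID[name] for name in folder_names]
```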
# Order Data Collection and Annotation tailored to your specifications
[https://trainingdata.pro/data-market](https://trainingdata.pro/data-market?utm_source=kaggle-partner-anshtanwar&utm_medium=cpc&utm_campaign=jellyfish-types) offers high-quality data annotation tailored to your needs.
|
open-llm-leaderboard/details_princeton-nlp__Sheared-Pythia-160m | ---
pretty_name: Evaluation run of princeton-nlp/Sheared-Pythia-160m
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [princeton-nlp/Sheared-Pythia-160m](https://huggingface.co/princeton-nlp/Sheared-Pythia-160m)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_princeton-nlp__Sheared-Pythia-160m\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-05T11:51:47.160529](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-Pythia-160m/blob/main/results_2024-03-05T11-51-47.160529.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.265486732132447,\n\
\ \"acc_stderr\": 0.03103900531467752,\n \"acc_norm\": 0.2667178847012967,\n\
\ \"acc_norm_stderr\": 0.03183921317983812,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871112,\n \"mc2\": 0.4322455282459343,\n\
\ \"mc2_stderr\": 0.015239085992311467\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.1885665529010239,\n \"acc_stderr\": 0.011430897647675815,\n\
\ \"acc_norm\": 0.22440273037542663,\n \"acc_norm_stderr\": 0.012191404938603833\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2940649273053177,\n\
\ \"acc_stderr\": 0.004546901132945137,\n \"acc_norm\": 0.32065325632344155,\n\
\ \"acc_norm_stderr\": 0.0046577383989009355\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653695\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677084,\n\
\ \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677084\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.0281854413012341,\n\
\ \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.0281854413012341\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.21379310344827587,\n \"acc_stderr\": 0.03416520447747549,\n\
\ \"acc_norm\": 0.21379310344827587,\n \"acc_norm_stderr\": 0.03416520447747549\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24338624338624337,\n \"acc_stderr\": 0.022101128787415426,\n \"\
acc_norm\": 0.24338624338624337,\n \"acc_norm_stderr\": 0.022101128787415426\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.035122074123020534,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.035122074123020534\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3096774193548387,\n \"acc_stderr\": 0.026302774983517414,\n \"\
acc_norm\": 0.3096774193548387,\n \"acc_norm_stderr\": 0.026302774983517414\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2413793103448276,\n \"acc_stderr\": 0.030108330718011625,\n \"\
acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.030108330718011625\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3383838383838384,\n \"acc_stderr\": 0.03371124142626303,\n \"\
acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.03371124142626303\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.35751295336787564,\n \"acc_stderr\": 0.034588160421810045,\n\
\ \"acc_norm\": 0.35751295336787564,\n \"acc_norm_stderr\": 0.034588160421810045\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3564102564102564,\n \"acc_stderr\": 0.024283140529467295,\n\
\ \"acc_norm\": 0.3564102564102564,\n \"acc_norm_stderr\": 0.024283140529467295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.026653531596715477,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.026653531596715477\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.344954128440367,\n \"acc_stderr\": 0.02038060540506697,\n \"acc_norm\"\
: 0.344954128440367,\n \"acc_norm_stderr\": 0.02038060540506697\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n\
\ \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.46296296296296297,\n\
\ \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.2696078431372549,\n \"acc_stderr\": 0.03114557065948678,\n\
\ \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.03114557065948678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598025,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598025\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.1210762331838565,\n\
\ \"acc_stderr\": 0.021894174113185737,\n \"acc_norm\": 0.1210762331838565,\n\
\ \"acc_norm_stderr\": 0.021894174113185737\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3884297520661157,\n \"acc_stderr\": 0.04449270350068382,\n \"\
acc_norm\": 0.3884297520661157,\n \"acc_norm_stderr\": 0.04449270350068382\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.03623089915724148,\n\
\ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.03623089915724148\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.04007341809755805,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.04007341809755805\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326466,\n\
\ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326466\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n\
\ \"acc_stderr\": 0.029614323690456648,\n \"acc_norm\": 0.2863247863247863,\n\
\ \"acc_norm_stderr\": 0.029614323690456648\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.1979565772669221,\n\
\ \"acc_stderr\": 0.014248873549217589,\n \"acc_norm\": 0.1979565772669221,\n\
\ \"acc_norm_stderr\": 0.014248873549217589\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.02353292543104428,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.02353292543104428\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767864,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.02463004897982478,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.02463004897982478\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2379421221864952,\n\
\ \"acc_stderr\": 0.024185150647818707,\n \"acc_norm\": 0.2379421221864952,\n\
\ \"acc_norm_stderr\": 0.024185150647818707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22839506172839505,\n \"acc_stderr\": 0.023358211840626267,\n\
\ \"acc_norm\": 0.22839506172839505,\n \"acc_norm_stderr\": 0.023358211840626267\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090201,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090201\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.242503259452412,\n\
\ \"acc_stderr\": 0.010946570966348776,\n \"acc_norm\": 0.242503259452412,\n\
\ \"acc_norm_stderr\": 0.010946570966348776\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2434640522875817,\n \"acc_stderr\": 0.017362473762146634,\n \
\ \"acc_norm\": 0.2434640522875817,\n \"acc_norm_stderr\": 0.017362473762146634\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\
\ \"acc_stderr\": 0.031157150869355575,\n \"acc_norm\": 0.263681592039801,\n\
\ \"acc_norm_stderr\": 0.031157150869355575\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1927710843373494,\n\
\ \"acc_stderr\": 0.03070982405056527,\n \"acc_norm\": 0.1927710843373494,\n\
\ \"acc_norm_stderr\": 0.03070982405056527\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.034678266857038266,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.034678266857038266\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871112,\n \"mc2\": 0.4322455282459343,\n\
\ \"mc2_stderr\": 0.015239085992311467\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5169692186266772,\n \"acc_stderr\": 0.01404439040161298\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \
\ \"acc_stderr\": 0.0016927007401501906\n }\n}\n```"
repo_url: https://huggingface.co/princeton-nlp/Sheared-Pythia-160m
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|arc:challenge|25_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|gsm8k|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hellaswag|10_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T11-51-47.160529.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-05T11-51-47.160529.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- '**/details_harness|winogrande|5_2024-03-05T11-51-47.160529.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-05T11-51-47.160529.parquet'
- config_name: results
data_files:
- split: 2024_03_05T11_51_47.160529
path:
- results_2024-03-05T11-51-47.160529.parquet
- split: latest
path:
- results_2024-03-05T11-51-47.160529.parquet
---
# Dataset Card for Evaluation run of princeton-nlp/Sheared-Pythia-160m
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [princeton-nlp/Sheared-Pythia-160m](https://huggingface.co/princeton-nlp/Sheared-Pythia-160m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_princeton-nlp__Sheared-Pythia-160m",
"harness_winogrande_5",
split="train")
```
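Each configuration above exposes one timestamped split per run plus a `latest` alias. As a minimal sketch (the `latest_run_split` helper is hypothetical, not part of the `datasets` API), note that the timestamped names use a fixed-width `YYYY_MM_DDTHH_MM_SS.ffffff` layout, so lexicographic order matches chronological order and the most recent run can be found with a plain `max()`:

```python
# Timestamped split names have a fixed-width YYYY_MM_DDTHH_MM_SS.ffffff layout,
# so comparing them as strings is the same as comparing them as timestamps.
def latest_run_split(split_names):
    """Return the most recent timestamped split name (hypothetical helper)."""
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped) if timestamped else "latest"

print(latest_run_split(["2024_03_05T11_51_47.160529", "latest"]))
```

This is only relevant when a repo accumulates several runs; with a single run, the `latest` split and the one timestamped split contain the same data.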
## Latest results
These are the [latest results from run 2024-03-05T11:51:47.160529](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-Pythia-160m/blob/main/results_2024-03-05T11-51-47.160529.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.265486732132447,
"acc_stderr": 0.03103900531467752,
"acc_norm": 0.2667178847012967,
"acc_norm_stderr": 0.03183921317983812,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871112,
"mc2": 0.4322455282459343,
"mc2_stderr": 0.015239085992311467
},
"harness|arc:challenge|25": {
"acc": 0.1885665529010239,
"acc_stderr": 0.011430897647675815,
"acc_norm": 0.22440273037542663,
"acc_norm_stderr": 0.012191404938603833
},
"harness|hellaswag|10": {
"acc": 0.2940649273053177,
"acc_stderr": 0.004546901132945137,
"acc_norm": 0.32065325632344155,
"acc_norm_stderr": 0.0046577383989009355
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677084,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677084
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.0281854413012341,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.0281854413012341
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.21379310344827587,
"acc_stderr": 0.03416520447747549,
"acc_norm": 0.21379310344827587,
"acc_norm_stderr": 0.03416520447747549
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24338624338624337,
"acc_stderr": 0.022101128787415426,
"acc_norm": 0.24338624338624337,
"acc_norm_stderr": 0.022101128787415426
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020534,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020534
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3096774193548387,
"acc_stderr": 0.026302774983517414,
"acc_norm": 0.3096774193548387,
"acc_norm_stderr": 0.026302774983517414
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.030108330718011625,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.030108330718011625
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3383838383838384,
"acc_stderr": 0.03371124142626303,
"acc_norm": 0.3383838383838384,
"acc_norm_stderr": 0.03371124142626303
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35751295336787564,
"acc_stderr": 0.034588160421810045,
"acc_norm": 0.35751295336787564,
"acc_norm_stderr": 0.034588160421810045
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3564102564102564,
"acc_stderr": 0.024283140529467295,
"acc_norm": 0.3564102564102564,
"acc_norm_stderr": 0.024283140529467295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.026653531596715477,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.026653531596715477
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.344954128440367,
"acc_stderr": 0.02038060540506697,
"acc_norm": 0.344954128440367,
"acc_norm_stderr": 0.02038060540506697
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.03114557065948678,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.03114557065948678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598025,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598025
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.1210762331838565,
"acc_stderr": 0.021894174113185737,
"acc_norm": 0.1210762331838565,
"acc_norm_stderr": 0.021894174113185737
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3884297520661157,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.3884297520661157,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.03623089915724148,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.03623089915724148
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755805,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755805
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326466,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326466
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.029614323690456648,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.029614323690456648
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.1979565772669221,
"acc_stderr": 0.014248873549217589,
"acc_norm": 0.1979565772669221,
"acc_norm_stderr": 0.014248873549217589
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.02353292543104428,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.02353292543104428
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767864,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767864
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2379421221864952,
"acc_stderr": 0.024185150647818707,
"acc_norm": 0.2379421221864952,
"acc_norm_stderr": 0.024185150647818707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22839506172839505,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.22839506172839505,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090201,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090201
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.242503259452412,
"acc_stderr": 0.010946570966348776,
"acc_norm": 0.242503259452412,
"acc_norm_stderr": 0.010946570966348776
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2434640522875817,
"acc_stderr": 0.017362473762146634,
"acc_norm": 0.2434640522875817,
"acc_norm_stderr": 0.017362473762146634
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.031157150869355575,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.031157150869355575
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1927710843373494,
"acc_stderr": 0.03070982405056527,
"acc_norm": 0.1927710843373494,
"acc_norm_stderr": 0.03070982405056527
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871112,
"mc2": 0.4322455282459343,
"mc2_stderr": 0.015239085992311467
},
"harness|winogrande|5": {
"acc": 0.5169692186266772,
"acc_stderr": 0.01404439040161298
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401501906
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
aneeshas/imsdb-sci-fi-movie-scripts | ---
dataset_info:
features:
- name: Sci-Fi
dtype: string
splits:
- name: train
num_bytes: 35727494
num_examples: 150
download_size: 16207093
dataset_size: 35727494
---
# Dataset Card for "imsdb-sci-fi-movie-scripts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sigurdur/icelandic-qa-hugi | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 255965534
num_examples: 387833
download_size: 158309799
dataset_size: 255965534
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- question-answering
language:
- is
pretty_name: Icelandic question-answering dataset
---
# Icelandic question-answering dataset
The same dataset as https://huggingface.co/datasets/Sigurdur/hugi_korkar, except that only the first response to each question has been kept; the rest have been discarded.
The dataset has not yet been cleaned and may contain question-answer pairs that are not suitable for all audiences.
Author: Sigurdur Haukur Birgisson
|
adityarra07/ATC_train_noise | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: 'null'
- name: sampling_rate
dtype: int64
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 10104439114
num_examples: 22152
- name: test
num_bytes: 227942352
num_examples: 500
download_size: 10344802156
dataset_size: 10332381466
---
# Dataset Card for "ATC_train_noise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DL0628/LayoutLMv3 | ---
dataset_info:
features:
- name: image
dtype: image
- name: bboxes
sequence:
sequence: float64
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
- name: id
dtype: string
splits:
- name: train
num_bytes: 27416070.0
num_examples: 76
- name: test
num_bytes: 1725594.0
num_examples: 5
- name: validation
num_bytes: 3725970.0
num_examples: 9
download_size: 25385590
dataset_size: 32867634.0
---
# Dataset Card for "LayoutLMv3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bloyal/uniref50 | ---
dataset_info:
features:
- name: ids
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 19549591530
num_examples: 62759891
download_size: 18546997577
dataset_size: 19549591530
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
---
# Dataset Card for UniRef50
UniRef50 data downloaded from https://www.uniprot.org/help/downloads on January 24, 2024. |
distilled-from-one-sec-cv12/chunk_90 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1306720104
num_examples: 254622
download_size: 1334186776
dataset_size: 1306720104
---
# Dataset Card for "chunk_90"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
quyanh/cot | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 3530400.0
num_examples: 9000
download_size: 2120620
dataset_size: 3530400.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zhenganlin/test-dataset | ---
license: openrail
---
|
tr416/literalist_ds | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 348610
num_examples: 269
download_size: 182438
dataset_size: 348610
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "literalist_ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alasdairforsythe/text-english-code-fiction-nonfiction | ---
language:
- en
pretty_name: 'TokenMonster Datasets: English, Code, Fiction, Non-fiction'
size_categories:
- 1B<n<10B
tags:
- text
- english
- fiction
- nonfiction
- non-fiction
- modern fiction
- contemporary fiction
- fiction dataset
- code dataset
- english dataset
- code
- code samples
- tokenization
- tokenization datasets
- datasets
task_categories:
- text-generation
---
## TokenMonster Datasets: English, Code, Fiction, Non-fiction
Included are datasets that were used to generate the TokenMonster pre-built vocabularies. All are raw text files.
The training data mostly came from RedPajama's [1B Token Sample](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T-Sample). However, to reduce the proportion of formal English and emphasize other languages, informal writing, and code, c4_sample and cc_sample were cropped to 100MB, and [Reddit conversations](https://huggingface.co/datasets/SophieTr/reddit_clean) data were added (also cropped to 100MB).
Additionally, equally weighted `code` samples of 2MB per language (code_2mb) and 10MB per language (code_10mb) were added for 30 different programming languages to ensure that every language has representation. The source of the `code` samples was [codeparrot/github-code](https://huggingface.co/datasets/codeparrot/github-code). To ensure a range of coding styles, I allowed only one file per GitHub repository and at most 200 lines per file, selected from the middle of the file.
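The sampling code itself isn't included in this card; as a rough sketch, the "at most 200 lines from the middle of the file" selection described above could be implemented like this (a hypothetical helper, not the actual TokenMonster tooling):

```python
def middle_lines(text: str, max_lines: int = 200) -> str:
    """Keep at most max_lines lines, taken from the middle of the text."""
    lines = text.splitlines()
    if len(lines) <= max_lines:
        return text
    # Center the window: skip an equal number of lines on each side.
    start = (len(lines) - max_lines) // 2
    return "\n".join(lines[start:start + max_lines])

# A 1000-line file keeps lines 400..599 (0-indexed).
sample = "\n".join(f"line {i}" for i in range(1000))
print(middle_lines(sample).splitlines()[0])  # line 400
```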
Given the evolving nature of writing styles, I felt that `book_sample.txt`, which consists of out-of-copyright books, was not a good representation of contemporary fiction. To better represent a more modern style, I curated `fiction.txt` and `fiction_100mb.txt` by throwing together a few other datasets and cleaning it up.
| Filename | Filesize |
|--------------------------|-----------|
| arxiv_sample.txt | 88,925,569 |
| book_sample.txt | 108,069,616 |
| c4_sample.txt | 100,560,318 |
| cc_2023-06_sample.txt | 100,852,231 |
| code_2mb.txt | 62,895,904 |
| code_10mb.txt | 314,006,799 |
| fiction.txt | 357,119,086 |
| fiction_100mb.txt | 94,235,489 |
| github_sample.txt | 191,123,094 |
| stackexchange_sample.txt | 71,940,138 |
| wikipedia_sample.txt | 79,181,873 |
| reddit.txt | 100,027,565 |
Note: `fiction_100mb.txt` is a subset of `fiction.txt`, and `code_2mb.txt` is a subset of `code_10mb.txt`.
### License
* [Common Crawl Foundation Terms of Use](https://commoncrawl.org/terms-of-use/full/)
* [C4 license](https://huggingface.co/datasets/allenai/c4#license)
* [the_pile_books3 license](https://huggingface.co/datasets/the_pile_books3#licensing-information) and [pg19 license](https://huggingface.co/datasets/pg19#licensing-information)
* [ArXiv Terms of Use](https://info.arxiv.org/help/api/tou.html)
* [Wikipedia License](https://huggingface.co/datasets/wikipedia#licensing-information)
* [StackExchange license on the Internet Archive](https://archive.org/details/stackexchange) |
SEACrowd/indo4b | ---
tags:
- self-supervised-pretraining
language:
- ind
---
# indo4b
Indo4B is a large-scale Indonesian self-supervised pre-training corpus consisting of around 3.6B words in around 250M sentences. The corpus covers both formal and colloquial Indonesian, compiled from 12 sources: two cover colloquial Indonesian, eight cover formal Indonesian, and the rest mix both styles.
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@inproceedings{wilie-etal-2020-indonlu,
title = "{I}ndo{NLU}: Benchmark and Resources for Evaluating {I}ndonesian
Natural Language Understanding",
author = "Wilie, Bryan and
Vincentio, Karissa and
Winata, Genta Indra and
Cahyawijaya, Samuel and
Li, Xiaohong and
Lim, Zhi Yuan and
Soleman, Sidik and
Mahendra, Rahmad and
Fung, Pascale and
Bahar, Syafri and
Purwarianti, Ayu",
booktitle = "Proceedings of the 1st Conference of the Asia-Pacific Chapter of the
Association for Computational Linguistics and the 10th International Joint
Conference on Natural Language Processing",
month = dec,
year = "2020",
address = "Suzhou, China",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.aacl-main.85",
pages = "843--857",
abstract = "Although Indonesian is known to be the fourth most frequently used language
over the internet, the research progress on this language in natural language processing (NLP)
is slow-moving due to a lack of available resources. In response, we introduce the first-ever vast
resource for training, evaluation, and benchmarking on Indonesian natural language understanding
(IndoNLU) tasks. IndoNLU includes twelve tasks, ranging from single sentence classification to
pair-sentences sequence labeling with different levels of complexity. The datasets for the tasks
lie in different domains and styles to ensure task diversity. We also provide a set of Indonesian
pre-trained models (IndoBERT) trained from a large and clean Indonesian dataset (Indo4B) collected
from publicly available sources such as social media texts, blogs, news, and websites.
We release baseline models for all twelve tasks, as well as the framework for benchmark evaluation,
thus enabling everyone to benchmark their system performances.",
}
```
## License
CC0
## Homepage
[https://github.com/IndoNLP/indonlu](https://github.com/IndoNLP/indonlu)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
SeanWu25/NEJM-AI_Benchmarking_Medical_Language_Models | ---
license: apache-2.0
tags:
- medical
size_categories:
- n<1K
---
# A Comparative Study of Open-Source Large Language Models
## Dataset Overview
Welcome to the dataset repository for our paper, "A Comparative Study of Open-Source Large Language Models, GPT-4 and Claude 2: Multiple-Choice Test Taking in Nephrology." The preprint of the paper can be accessed [here](https://arxiv.org/abs/2308.04709).
## Files
This repository contains two key files:
1. **NEJM_All_Questions_And_Answers.csv**: This file includes all the questions and corresponding answers used in the study.
2. **Ground_Truth_Answers.csv**: This file provides ground truth explanations associated with the questions in the main dataset.
## Usage
To utilize this dataset for your research or experimentation:
1. **Download**: Obtain the dataset files from this repository.
2. **Load**: Import the dataset into your preferred data analysis or machine learning environment.
3. **Explore**: Investigate the questions, answers, and ground truth explanations for your specific use case.
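As a concrete sketch of the "Load" step, the CSVs can be read with Python's standard library. The column names below are placeholders, not confirmed by this card — inspect the downloaded files for the real headers:

```python
import csv
import io

# Stand-in rows for NEJM_All_Questions_And_Answers.csv; the actual file's
# column names may differ -- check the downloaded CSV before relying on them.
sample_csv = io.StringIO(
    "question,answer\n"
    "Which electrolyte disturbance is most typical?,Hyperkalemia\n"
)
rows = list(csv.DictReader(sample_csv))
print(rows[0]["answer"])  # Hyperkalemia
```

Replace `sample_csv` with `open("NEJM_All_Questions_And_Answers.csv")` once the file has been downloaded from this repository.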
## Paper
Our paper has been accepted to NEJM AI. For now, please read the preprint at https://arxiv.org/abs/2308.04709
|
Falah/2M_creature_animales_SDXL_refiner_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 1442174977
num_examples: 2000000
download_size: 185553918
dataset_size: 1442174977
---
# Dataset Card for "2M_creature_animales_SDXL_refiner_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
haturusinghe/sold-llama2-1k_v2 | ---
dataset_info:
features:
- name: tweet
dtype: string
- name: label
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 5405878
num_examples: 6000
- name: val
num_bytes: 1324440
num_examples: 1500
- name: test
num_bytes: 2246228
num_examples: 2500
download_size: 3070684
dataset_size: 8976546
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
alberto2/LLamaVoz | ---
license: llama2
---
|
Elfsong/seven_cups | ---
configs:
- config_name: default
data_files:
- split: anxiety
path: data/anxiety-*
- split: bipolar
path: data/bipolar-*
- split: depression
path: data/depression-*
- split: personalitydisorders
path: data/personalitydisorders-*
- split: trauma
path: data/trauma-*
- split: eds
path: data/eds-*
- split: substanceaddiction
path: data/substanceaddiction-*
- split: relationships
path: data/relationships-*
dataset_info:
features:
- name: lead_post
struct:
- name: author
dtype: string
- name: content
dtype: string
- name: date
dtype: string
- name: thread_id
dtype: string
- name: title
dtype: string
- name: topic
dtype: string
- name: url
dtype: string
- name: comment_posts
list:
- name: author
dtype: string
- name: content
dtype: string
- name: parent_ids
sequence: string
- name: post_id
dtype: string
- name: thread_id
dtype: string
- name: url
dtype: string
splits:
- name: anxiety
num_bytes: 24332055
num_examples: 7948
- name: bipolar
num_bytes: 3496018
num_examples: 1033
- name: depression
num_bytes: 59927557
num_examples: 10243
- name: personalitydisorders
num_bytes: 9791687
num_examples: 1854
- name: trauma
num_bytes: 53211657
num_examples: 5763
- name: eds
num_bytes: 9837092
num_examples: 2382
- name: substanceaddiction
num_bytes: 1957813
num_examples: 687
- name: relationships
num_bytes: 56187112
num_examples: 12652
download_size: 94273903
dataset_size: 218740991
---
# Dataset Card for "seven_cups"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/yarizui_sen_bento | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yarizui Sen
This is the dataset of Yarizui Sen, containing 198 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 198 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 420 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 198 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 198 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 198 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 198 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 198 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 420 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 420 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 420 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Slerp | ---
pretty_name: Evaluation run of Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp](https://huggingface.co/Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T16:53:19.272337](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Slerp/blob/main/results_2023-12-09T16-53-19.272337.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6389439642939008,\n\
\ \"acc_stderr\": 0.03231020427870188,\n \"acc_norm\": 0.6389579295248086,\n\
\ \"acc_norm_stderr\": 0.03297676323880707,\n \"mc1\": 0.38922888616891066,\n\
\ \"mc1_stderr\": 0.017068552680690328,\n \"mc2\": 0.5522545162562386,\n\
\ \"mc2_stderr\": 0.015322345793520823\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759093,\n\
\ \"acc_norm\": 0.6569965870307167,\n \"acc_norm_stderr\": 0.013872423223718164\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6550487950607449,\n\
\ \"acc_stderr\": 0.004743808792037863,\n \"acc_norm\": 0.8450507866958773,\n\
\ \"acc_norm_stderr\": 0.0036111673029597833\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"\
acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.036959801280988226,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.036959801280988226\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.01377869377846408,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.01377869377846408\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n\
\ \"acc_stderr\": 0.016513676031179602,\n \"acc_norm\": 0.4212290502793296,\n\
\ \"acc_norm_stderr\": 0.016513676031179602\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137894,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137894\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4439374185136897,\n\
\ \"acc_stderr\": 0.012689708167787684,\n \"acc_norm\": 0.4439374185136897,\n\
\ \"acc_norm_stderr\": 0.012689708167787684\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681404,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681404\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6486928104575164,\n \"acc_stderr\": 0.019312676065786547,\n \
\ \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.019312676065786547\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38922888616891066,\n\
\ \"mc1_stderr\": 0.017068552680690328,\n \"mc2\": 0.5522545162562386,\n\
\ \"mc2_stderr\": 0.015322345793520823\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7995264404104183,\n \"acc_stderr\": 0.011251958281205083\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6982562547384382,\n \
\ \"acc_stderr\": 0.01264354476287336\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|arc:challenge|25_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|gsm8k|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hellaswag|10_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T16-53-19.272337.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T16-53-19.272337.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- '**/details_harness|winogrande|5_2023-12-09T16-53-19.272337.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T16-53-19.272337.parquet'
- config_name: results
data_files:
- split: 2023_12_09T16_53_19.272337
path:
- results_2023-12-09T16-53-19.272337.parquet
- split: latest
path:
- results_2023-12-09T16-53-19.272337.parquet
---
# Dataset Card for Evaluation run of Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp](https://huggingface.co/Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-09T16:53:19.272337](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Slerp/blob/main/results_2023-12-09T16-53-19.272337.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6389439642939008,
"acc_stderr": 0.03231020427870188,
"acc_norm": 0.6389579295248086,
"acc_norm_stderr": 0.03297676323880707,
"mc1": 0.38922888616891066,
"mc1_stderr": 0.017068552680690328,
"mc2": 0.5522545162562386,
"mc2_stderr": 0.015322345793520823
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759093,
"acc_norm": 0.6569965870307167,
"acc_norm_stderr": 0.013872423223718164
},
"harness|hellaswag|10": {
"acc": 0.6550487950607449,
"acc_stderr": 0.004743808792037863,
"acc_norm": 0.8450507866958773,
"acc_norm_stderr": 0.0036111673029597833
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.036959801280988226,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.036959801280988226
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.01377869377846408,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.01377869377846408
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.016513676031179602,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.016513676031179602
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137894,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4439374185136897,
"acc_stderr": 0.012689708167787684,
"acc_norm": 0.4439374185136897,
"acc_norm_stderr": 0.012689708167787684
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681404,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681404
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.019312676065786547,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.019312676065786547
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38922888616891066,
"mc1_stderr": 0.017068552680690328,
"mc2": 0.5522545162562386,
"mc2_stderr": 0.015322345793520823
},
"harness|winogrande|5": {
"acc": 0.7995264404104183,
"acc_stderr": 0.011251958281205083
},
"harness|gsm8k|5": {
"acc": 0.6982562547384382,
"acc_stderr": 0.01264354476287336
}
}
```
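The per-task entries in the results above all share the same shape, so aggregate metrics can be recomputed from them directly. The snippet below averages `acc` over a few of the `hendrycksTest` entries shown above; the dict is a small excerpt for illustration, not the full task set used for the leaderboard average:

```python
# Average the "acc" metric over MMLU (hendrycksTest) sub-tasks.
# The dict below is a small excerpt of the results JSON shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.27},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8187134502923976},
    "harness|winogrande|5": {"acc": 0.7995264404104183},  # not an MMLU task
}

mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU sub-tasks: {len(mmlu_accs)}, mean acc: {mmlu_avg:.4f}")
```

The same filtering works on the full JSON file linked above, since every MMLU entry uses the `harness|hendrycksTest-` prefix.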
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
loubnabnl/stories_oh_problem | ---
dataset_info:
features:
- name: prompt_problem_solving_story
dtype: string
- name: category
dtype: 'null'
- name: completion
dtype: string
- name: token_length
dtype: int64
splits:
- name: train
num_bytes: 21376595
num_examples: 5000
download_size: 12467295
dataset_size: 21376595
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AI-Secure/ChatScene-v1 | ---
license: cc
task_categories:
- text-to-image
- text-to-video
language:
- en
size_categories:
- n<1K
---
# Video and Key Frame Data
## Description
This repository contains video data, extracted key frames, and associated metadata for a collection of scenarios, each corresponding to a different behavior in a simulation environment.
## Directory Structure
- `video/`: This directory holds the original MP4 video files, organized by scenario and behavior. We provide around 40 MP4 files for each scenario-behavior pair, with different routes, speeds, and surrounding environments.
- `key_frames/`: Here, five key frames extracted from each video are stored. They are organized into folders mirroring the structure of the `video/` directory.
- `scenario_descriptions.csv`: This file provides textual descriptions of each scenario in the videos.
- `video_statistics.csv`: This file contains statistics extracted from the videos, including details like velocity, acceleration, and collision status for each frame of the corresponding MP4.
## Usage
The videos can be used to analyze the behavior in each scenario. The key frames provide quick snapshots of the scenarios at different time intervals, which can be used for further analysis or for generating thumbnails.
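The five key frames per video are evenly spaced snapshots, and that index selection is easy to reproduce for downstream tooling. The helper below is a hypothetical sketch (the repository's own `extract_frames.py` may select frames differently); an OpenCV-based extractor would seek to each returned index via `cv2.CAP_PROP_POS_FRAMES` before reading the frame:

```python
def keyframe_indices(total_frames: int, k: int = 5) -> list[int]:
    """Pick k frame indices spread evenly across a video of total_frames frames."""
    if total_frames <= 0 or k <= 0:
        return []
    if total_frames <= k:
        return list(range(total_frames))
    # Sample at the midpoints of k equal segments so the first/last
    # frames are not over-represented.
    step = total_frames / k
    return [int(step * i + step / 2) for i in range(k)]

print(keyframe_indices(300))  # [30, 90, 150, 210, 270]
```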
## Scripts
- `extract_frames.py`: A Python script used to extract key frames from the videos. |
macadeliccc/distilabel-code-instructions | ---
dataset_info:
features:
- name: instructions
dtype: string
splits:
- name: train
num_bytes: 226140
num_examples: 2200
download_size: 80198
dataset_size: 226140
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- synthetic
- distilabel
---
|
schibsted/recsys-slates-dataset | ---
license: apache-2.0
---
|
rcds/wikipedia-persons-masked | ---
annotations_creators:
- other
language_creators:
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- multilingual
paperswithcode_id: null
pretty_name: "wikipedia persons masked: A filtered version of the wikipedia dataset, with only pages of people."
size_categories:
- 10M<n<100M
source_datasets:
- original
task_categories:
- fill-mask
---
# wikipedia persons masked: A filtered version of the wikipedia dataset, with only pages of people
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Contains ~70k pages from wikipedia, each describing a person. For each page, the person described in the text
is masked with a `<mask>` token. The ground truth for every mask is provided.
### Supported Tasks and Leaderboards
The dataset supports the fill-mask task, but it can also be used for other tasks such as question answering,
e.g. "Who is `<mask>`?"
### Languages
*english only*
## Dataset Structure
There is one large dataset file (dataset.jsonl.xz), containing all data.
Use the dataset like this:
```python
from datasets import load_dataset
dataset = load_dataset('rcds/wikipedia-persons-masked')
```
### Data Fields
Columns are:
- id: the id in the original dataset
- url: the link to the wikipedia page
- title: the title of the wikipedia page
- text: the original wikipedia text
- sentences: text split to sentences
- paraphrased_sentences: text split to sentences, with each sentence paraphrased (e.g. mutated a bit)
- masked_text_original: the original text with the entity masked in every occurrence
- masked_entities_original: array of entities masked in masked_text_original
- masked_text_paraphrased: paraphrased text with the entity masked in every occurrence
- masked_entities_paraphrased: array of entities masked in masked_text_paraphrased
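Assuming the entity arrays list the ground-truth strings in the order the masks appear (which the field descriptions suggest but do not state explicitly), the original text can be recovered by filling each `<mask>` back in sequence. A minimal sketch with a toy row in the shape of the dataset fields:

```python
def unmask(masked_text: str, entities: list[str]) -> str:
    """Fill each <mask> token with the corresponding ground-truth entity."""
    text = masked_text
    for entity in entities:
        text = text.replace("<mask>", entity, 1)  # replace leftmost mask only
    return text

row = {  # toy example, not an actual dataset row
    "masked_text_original": "<mask> was born in 1879. In 1905, <mask> developed special relativity.",
    "masked_entities_original": ["Albert Einstein", "Einstein"],
}
print(unmask(row["masked_text_original"], row["masked_entities_original"]))
# Albert Einstein was born in 1879. In 1905, Einstein developed special relativity.
```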
### Data Splits
There are no splits.
## Dataset Creation
This dataset was created by using the wikipedia dataset from huggingface and processing it from there.
People were queried via wikidata. The texts were split into sentences with nltk punkt and paraphrased with tuner007's pegasus.
Entity recognition was performed with bert-base-NER by dslim, and the recognized entities were replaced with a mask token.
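The masking step of that pipeline — replacing recognized entity mentions with a mask token while recording the ground truth — can be sketched without the NER model itself. The snippet below is a simplified illustration only: a real run would take the mentions from bert-base-NER rather than from a hard-coded list.

```python
import re

def mask_entities(text: str, mentions: list[str], mask: str = "<mask>") -> tuple[str, list[str]]:
    """Replace every occurrence of the given entity mentions with `mask`,
    returning the masked text and the ground-truth entities in order."""
    # Longer mentions first so "Albert Einstein" wins over "Einstein".
    pattern = re.compile("|".join(re.escape(m) for m in sorted(mentions, key=len, reverse=True)))
    masked_entities = []

    def _sub(match: re.Match) -> str:
        masked_entities.append(match.group(0))
        return mask

    return pattern.sub(_sub, text), masked_entities

text = "Albert Einstein was born in Ulm. Einstein later moved to Bern."
masked, truths = mask_entities(text, ["Albert Einstein", "Einstein"])
print(masked)   # <mask> was born in Ulm. <mask> later moved to Bern.
print(truths)   # ['Albert Einstein', 'Einstein']
```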
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
TODO add citation
```
### Contributions
Thanks to [@skatinger](https://github.com/skatinger) for adding this dataset. |
Aniemore/cedr-m7 | ---
annotations_creators:
- found
language_creators:
- found
language:
- ru
license: mit
multilinguality:
- monolingual
pretty_name: cedr-m7
size_categories:
- 1K<n<10K
source_datasets:
- extended|cedr
task_categories:
- text-classification
task_ids:
- sentiment-classification
---
# Dataset Card for CEDR-M7
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@misc{Aniemore,
author = {Артем Аментес, Илья Лубенец, Никита Давидчук},
title = {Открытая библиотека искусственного интеллекта для анализа и выявления эмоциональных оттенков речи человека},
year = {2022},
publisher = {Hugging Face},
journal = {Hugging Face Hub},
howpublished = {\url{https://huggingface.com/aniemore/Aniemore}},
email = {hello@socialcode.ru}
}
```
### Contributions
Thanks to [@toiletsandpaper](https://github.com/toiletsandpaper) for adding this dataset.
|
Yemmy1000/cybersec_embedding_llama_chat | ---
dataset_info:
features:
- name: INSTRUCTION
dtype: string
- name: RESPONSE
dtype: string
splits:
- name: train
num_bytes: 5951997
num_examples: 7697
download_size: 2761782
dataset_size: 5951997
---
# Dataset Card for "cybersec_embedding_llama_chat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CptNemo/small-shakespear-sonets-1 | ---
license: apache-2.0
---
This dataset is a collection of Shakespeare sonnets, with a query for an LLM. |
thewall/PLBS | ---
license: apache-2.0
---
|
m44rcus/gsplats | ---
license: cc-by-sa-4.0
---
|
fdawd/21 | ---
license: zlib
---
|
kanishka/counterfactual-babylm-pipps_removal | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 581830554
num_examples: 11632119
- name: validation
num_bytes: 56120230
num_examples: 1026747
download_size: 421726778
dataset_size: 637950784
---
# Dataset Card for "counterfactual-babylm-pipps_removal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nerfgun3/dpin_style | ---
language:
- en
license: creativeml-openrail-m
thumbnail: "https://huggingface.co/datasets/Nerfgun3/dpin_style/resolve/main/dpin_showcase.png"
tags:
- stable-diffusion
- text-to-image
- image-to-image
inference: false
---
# Dpin Style Embedding / Textual Inversion
<img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/dpin_style/resolve/main/dpin_showcase.png"/>
## Usage
To use this embedding, download the file and drop it into the "\stable-diffusion-webui\embeddings" folder
To use it in a prompt: ```"dpin_style"```
Personally, I would recommend using my embeddings with a strength of 0.8, like ```"(dpin_style:0.8)"```
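As a minimal sketch of the `(token:weight)` attention syntax used in webui prompts (the helper function below is hypothetical, not part of any tool):

```python
# Hypothetical helper: wrap an embedding token in the "(token:weight)"
# attention syntax understood by stable-diffusion-webui prompts.
def weight_token(token: str, strength: float = 1.0) -> str:
    """Return the token, weighted if strength differs from the default 1.0."""
    if strength == 1.0:
        return token
    return f"({token}:{strength})"

prompt = "a painting of a castle, " + weight_token("dpin_style", 0.8)
print(prompt)  # a painting of a castle, (dpin_style:0.8)
```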
I hope you enjoy the embedding. If you have any questions, you can ask me anything via Discord: "Nerfgun3#7508"
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights to the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
open-llm-leaderboard/details_automerger__PasticheInex12-7B | ---
pretty_name: Evaluation run of automerger/PasticheInex12-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [automerger/PasticheInex12-7B](https://huggingface.co/automerger/PasticheInex12-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_automerger__PasticheInex12-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T18:20:31.992956](https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__PasticheInex12-7B/blob/main/results_2024-04-02T18-20-31.992956.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.653961766087049,\n\
\ \"acc_stderr\": 0.03214573713215449,\n \"acc_norm\": 0.6532931916719209,\n\
\ \"acc_norm_stderr\": 0.03282064825819239,\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7755797894971617,\n\
\ \"mc2_stderr\": 0.013842931270009715\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7192832764505119,\n \"acc_stderr\": 0.013131238126975578,\n\
\ \"acc_norm\": 0.7380546075085325,\n \"acc_norm_stderr\": 0.012849054826858108\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7216689902409879,\n\
\ \"acc_stderr\": 0.00447261314850891,\n \"acc_norm\": 0.8924517028480382,\n\
\ \"acc_norm_stderr\": 0.0030917590945195366\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091826,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091826\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368982,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368982\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n\
\ \"acc_stderr\": 0.016519594275297117,\n \"acc_norm\": 0.4223463687150838,\n\
\ \"acc_norm_stderr\": 0.016519594275297117\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826517,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826517\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4758800521512386,\n \"acc_stderr\": 0.012755368722863935,\n\
\ \"acc_norm\": 0.4758800521512386,\n \"acc_norm_stderr\": 0.012755368722863935\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n \"\
acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7755797894971617,\n\
\ \"mc2_stderr\": 0.013842931270009715\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8500394632991318,\n \"acc_stderr\": 0.010034394804580809\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6868840030326004,\n \
\ \"acc_stderr\": 0.012774285669385089\n }\n}\n```"
repo_url: https://huggingface.co/automerger/PasticheInex12-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|arc:challenge|25_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|gsm8k|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hellaswag|10_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-20-31.992956.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T18-20-31.992956.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- '**/details_harness|winogrande|5_2024-04-02T18-20-31.992956.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T18-20-31.992956.parquet'
- config_name: results
data_files:
- split: 2024_04_02T18_20_31.992956
path:
- results_2024-04-02T18-20-31.992956.parquet
- split: latest
path:
- results_2024-04-02T18-20-31.992956.parquet
---
# Dataset Card for Evaluation run of automerger/PasticheInex12-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [automerger/PasticheInex12-7B](https://huggingface.co/automerger/PasticheInex12-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_automerger__PasticheInex12-7B",
"harness_winogrande_5",
    split="latest")
```
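Each configuration exposes both a timestamped split and a `latest` alias. If several runs accumulate and you need to resolve the newest timestamped split yourself, a small helper like the following can sort them — a sketch that operates on plain split-name strings, assuming the `%Y_%m_%dT%H_%M_%S.%f` naming convention shown above:

```python
from datetime import datetime

def newest_split(split_names):
    """Return the most recent timestamped split name.

    Split names follow the pattern 2024_04_02T18_20_31.992956;
    the 'latest' alias is skipped so only real runs are compared.
    """
    fmt = "%Y_%m_%dT%H_%M_%S.%f"
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped, key=lambda s: datetime.strptime(s, fmt))

print(newest_split(["2024_04_02T18_20_31.992956", "latest"]))
# → 2024_04_02T18_20_31.992956
```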
## Latest results
These are the [latest results from run 2024-04-02T18:20:31.992956](https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__PasticheInex12-7B/blob/main/results_2024-04-02T18-20-31.992956.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you will find each task in the "results" config and in the "latest" split of its own eval config):
```python
{
"all": {
"acc": 0.653961766087049,
"acc_stderr": 0.03214573713215449,
"acc_norm": 0.6532931916719209,
"acc_norm_stderr": 0.03282064825819239,
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7755797894971617,
"mc2_stderr": 0.013842931270009715
},
"harness|arc:challenge|25": {
"acc": 0.7192832764505119,
"acc_stderr": 0.013131238126975578,
"acc_norm": 0.7380546075085325,
"acc_norm_stderr": 0.012849054826858108
},
"harness|hellaswag|10": {
"acc": 0.7216689902409879,
"acc_stderr": 0.00447261314850891,
"acc_norm": 0.8924517028480382,
"acc_norm_stderr": 0.0030917590945195366
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091826,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091826
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368982,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4223463687150838,
"acc_stderr": 0.016519594275297117,
"acc_norm": 0.4223463687150838,
"acc_norm_stderr": 0.016519594275297117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826517,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826517
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4758800521512386,
"acc_stderr": 0.012755368722863935,
"acc_norm": 0.4758800521512386,
"acc_norm_stderr": 0.012755368722863935
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7755797894971617,
"mc2_stderr": 0.013842931270009715
},
"harness|winogrande|5": {
"acc": 0.8500394632991318,
"acc_stderr": 0.010034394804580809
},
"harness|gsm8k|5": {
"acc": 0.6868840030326004,
"acc_stderr": 0.012774285669385089
}
}
```
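The per-task entries above all share one shape: each `harness|...` key maps to a dict of metrics. As an illustration, a sketch of how one might average the MMLU (`hendrycksTest`) subtask accuracies from such a dict — `sample` below is a tiny hypothetical stand-in for the loaded results JSON, not data from this run:

```python
def mmlu_average(results):
    """Mean 'acc' over all hendrycksTest subtasks in a results dict."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

# Tiny illustrative input (real files contain all 57 MMLU subtasks):
sample = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.75},
    "harness|winogrande|5": {"acc": 0.85},  # ignored: not an MMLU subtask
}
print(mmlu_average(sample))  # → 0.625
```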
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
GiovanniHD/AmiMizuno | ---
license: openrail
---
|
CVasNLPExperiments/DTD_parition1_test_google_flan_t5_xxl_mode_A_ns_100 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 42137
num_examples: 100
download_size: 0
dataset_size: 42137
---
# Dataset Card for "DTD_parition1_test_google_flan_t5_xxl_mode_A_ns_100"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pharma-IA/test_flagging | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Ksingleton/KBase_SDK_Docs | ---
license: apache-2.0
---
|
ntt123/viet-tts-dataset | ---
license: cc-by-nc-4.0
---
# Vietnamese Text-To-Speech dataset (VietTTS-v1.1)
🔔🔔🔔 Visit https://github.com/NTT123/vietTTS for a Vietnamese TTS library (including pretrained models). 🔔🔔🔔
The text comes from a collection of novels and short stories by the author Vũ Trọng Phụng and is in the public domain.
The audio is generated by the Google Text-to-Speech offline engine on Android. The audio is NOT for commercial use.
Dataset size: `5.4G`.
Total audio duration: `35.9 hours`.
### Text-audio samples
- Sample 1:
+ Audio: [file1](https://huggingface.co/datasets/ntt123/viet-tts-dataset/blob/main/000000.wav)
+ Text: `"Ai" đây tức là một kẻ ăn mày vậy. Anh ta chưa kịp quay đi thì đã thấy mấy con chó vàng chạy xồng xộc ra cứ nhảy xổ vào chân anh.`
- Sample 2:
+ Audio: [file2](https://huggingface.co/datasets/ntt123/viet-tts-dataset/blob/main/022878.wav)
+ Text: `Ừ, thế mày đã nuôi được bố mẹ mày bữa nào chưa, hay xưa nay vẫn báo hại cơm cha áo mẹ mãi? Mấy hôm thấy ông đơ mặt không thèm nói, mày lại làm già à?`
### Download
Get the dataset from here: [link](https://huggingface.co/datasets/ntt123/viet-tts-dataset/blob/main/viet-tts.tar.gz).
Or, run the following commands:
```
wget https://huggingface.co/datasets/ntt123/viet-tts-dataset/resolve/main/viet-tts.tar.gz -O viet-tts.tar.gz
mkdir -p dataset
tar -C dataset -xzf viet-tts.tar.gz
```
`dataset` directory structure:
```
dataset
├── collections.txt
├── meta_data.tsv
└── wav
├── 000000.wav
├── 000001.wav
├── 000002.wav
├── 000003.wav
...
```
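As a sanity check after extraction, the clip statistics reported below can be recomputed directly from the WAV files with the standard library (a sketch; the `dataset/wav` path assumes the directory layout shown above):

```python
# Sketch: recompute clip-duration statistics from the extracted WAV files.
# The "dataset/wav" path is an assumption based on the layout shown above.
import wave
from pathlib import Path

def clip_duration_seconds(wav_path):
    """Duration of a PCM WAV file in seconds (frames / sample rate)."""
    with wave.open(str(wav_path), "rb") as f:
        return f.getnframes() / f.getframerate()

def duration_stats(wav_dir):
    """Summarize clip durations for every *.wav file in a directory."""
    durations = sorted(clip_duration_seconds(p) for p in Path(wav_dir).glob("*.wav"))
    if not durations:
        return None
    return {
        "clips": len(durations),
        "shortest_s": durations[0],
        "median_s": durations[len(durations) // 2],
        "mean_s": sum(durations) / len(durations),
        "longest_s": durations[-1],
    }

if __name__ == "__main__":
    print(duration_stats("dataset/wav"))
```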
### Statistics
- Number of clips: 22884.
- Shortest audio clip: 0.46 seconds.
- Median clip duration: 5.46 seconds.
- Mean clip duration: 5.65 seconds.
- Longest audio clip: 15.4 seconds.
### Vũ Trọng Phụng's collections
- Bệnh Lao Chữa Bằng Mồm Hay Là ... Thầy Lang Bất Hủ, 1934?
- Cạm Bẫy Người, 1933.
- Cơm Thầy Cơm Cô, 1936.
- Đời Là Một Cuộc Chiến Đấu,1939.
- Dứt Tình, 1934.
- Giông Tố, 1936.
- Gương Tống Tiền, N/A.
- Hồ Sê Líu, Hồ Líu Sê Sàng, 1936.
- Kỹ Nghệ Lấy Tây, 1934.
- Làm Đĩ, 1936.
- Lấy Nhau Vì Tình, 1937.
- Lấy Vợ Xấu, 1937.
- Lòng Tự Ái, 1937.
- Máu Mê, 1937.
- Một Cái Chết, 1931.
- Một Con Chó Hay Chim Chuột, 1937.
- Một Đồng Bạc, 1939.
- Người Có Quyền, 1937.
- Sao Mày Không Vỡ Nắp Ơi!, 1934.
- Số Đỏ, 1936.
- Sư Cụ Triết Lý, 1935.
- Trúng Số Độc Đắc, 1938.
- Tự Do, 1937.
- Từ Lý Thuyết Đến Thực Hành, N/A.
- Vỡ Đê, 1936.
|
primate88/kek-fpep | ---
license: apache-2.0
language:
- en
tags:
- not-for-all-audiences
task_categories:
- text-generation
size_categories:
- 1M<n<10M
---
# kek-fpep
first post easiest post |
autoevaluate/autoeval-eval-squad_v2-squad_v2-8571ec-1652758615 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: Sangita/distilbert-base-uncased-finetuned-squad
metrics: []
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: Sangita/distilbert-base-uncased-finetuned-squad
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
vinisebk/jc_chasez | ---
license: openrail
---
|
nala-cub/americas_nli | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- ay
- bzd
- cni
- gn
- hch
- nah
- oto
- qu
- shp
- tar
license: cc-by-sa-4.0
multilinguality:
- multilingual
- translation
size_categories:
- unknown
source_datasets:
- extended|xnli
task_categories:
- text-classification
task_ids:
- natural-language-inference
pretty_name: 'AmericasNLI: A NLI Corpus of 10 Indigenous Low-Resource Languages.'
dataset_info:
- config_name: all_languages
features:
- name: language
dtype: string
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: validation
num_bytes: 1129080
num_examples: 6457
- name: test
num_bytes: 1210579
num_examples: 7486
download_size: 791239
dataset_size: 2339659
- config_name: aym
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: validation
num_bytes: 117530
num_examples: 743
- name: test
num_bytes: 115251
num_examples: 750
download_size: 87882
dataset_size: 232781
- config_name: bzd
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: validation
num_bytes: 143354
num_examples: 743
- name: test
num_bytes: 127676
num_examples: 750
download_size: 91039
dataset_size: 271030
- config_name: cni
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: validation
num_bytes: 113256
num_examples: 658
- name: test
num_bytes: 116284
num_examples: 750
download_size: 78899
dataset_size: 229540
- config_name: gn
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: validation
num_bytes: 115135
num_examples: 743
- name: test
num_bytes: 101948
num_examples: 750
download_size: 80429
dataset_size: 217083
- config_name: hch
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: validation
num_bytes: 127966
num_examples: 743
- name: test
num_bytes: 120857
num_examples: 750
download_size: 90748
dataset_size: 248823
- config_name: nah
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: validation
num_bytes: 50741
num_examples: 376
- name: test
num_bytes: 102953
num_examples: 738
download_size: 56953
dataset_size: 153694
- config_name: oto
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: validation
num_bytes: 27010
num_examples: 222
- name: test
num_bytes: 119650
num_examples: 748
download_size: 57849
dataset_size: 146660
- config_name: quy
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: validation
num_bytes: 125636
num_examples: 743
- name: test
num_bytes: 112750
num_examples: 750
download_size: 85673
dataset_size: 238386
- config_name: shp
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: validation
num_bytes: 124500
num_examples: 743
- name: test
num_bytes: 118934
num_examples: 750
download_size: 85544
dataset_size: 243434
- config_name: tar
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: validation
num_bytes: 139496
num_examples: 743
- name: test
num_bytes: 122624
num_examples: 750
download_size: 89683
dataset_size: 262120
configs:
- config_name: all_languages
data_files:
- split: validation
path: all_languages/validation-*
- split: test
path: all_languages/test-*
- config_name: aym
data_files:
- split: validation
path: aym/validation-*
- split: test
path: aym/test-*
- config_name: bzd
data_files:
- split: validation
path: bzd/validation-*
- split: test
path: bzd/test-*
- config_name: cni
data_files:
- split: validation
path: cni/validation-*
- split: test
path: cni/test-*
- config_name: gn
data_files:
- split: validation
path: gn/validation-*
- split: test
path: gn/test-*
- config_name: hch
data_files:
- split: validation
path: hch/validation-*
- split: test
path: hch/test-*
- config_name: nah
data_files:
- split: validation
path: nah/validation-*
- split: test
path: nah/test-*
- config_name: oto
data_files:
- split: validation
path: oto/validation-*
- split: test
path: oto/test-*
- config_name: quy
data_files:
- split: validation
path: quy/validation-*
- split: test
path: quy/test-*
- config_name: shp
data_files:
- split: validation
path: shp/validation-*
- split: test
path: shp/test-*
- config_name: tar
data_files:
- split: validation
path: tar/validation-*
- split: test
path: tar/test-*
---
# Dataset Card for AmericasNLI
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [Needs More Information]
- **Repository:** https://github.com/abteen/americasnli
- **Repository:** https://github.com/nala-cub/AmericasNLI
- **Paper:** https://arxiv.org/abs/2104.08726
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
### Dataset Summary
AmericasNLI extends XNLI (Conneau et al., 2018), a natural language inference (NLI) dataset covering 15 high-resource languages, to 10 low-resource Indigenous languages spoken in the Americas: Ashaninka, Aymara, Bribri, Guarani, Nahuatl, Otomi, Quechua, Raramuri, Shipibo-Konibo, and Wixarika. As with MNLI, the goal is to predict textual entailment (does sentence A imply, contradict, or neither with respect to sentence B), framed as a three-way classification task over sentence pairs.
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
- aym
- bzd
- cni
- gn
- hch
- nah
- oto
- quy
- shp
- tar
## Dataset Structure
### Data Instances
#### all_languages
An example of the test split looks as follows:
```
{'language': 'aym', 'premise': "Ukhamaxa, janiw ukatuqits lup'kayätti, ukhamarus wali phiñasitayätwa, ukatx jupampiw mayamp aruskipañ qallanttha.", 'hypothesis': 'Janiw mayamp jupampix parlxapxti.', 'label': 2}
```
#### aym
An example of the test split looks as follows:
```
{'premise': "Ukhamaxa, janiw ukatuqits lup'kayätti, ukhamarus wali phiñasitayätwa, ukatx jupampiw mayamp aruskipañ qallanttha.", 'hypothesis': 'Janiw mayamp jupampix parlxapxti.', 'label
': 2}
```
#### bzd
An example of the test split looks as follows:
```
{'premise': "Bua', kèq ye' kũ e' bikeitsök erë ye' chkénãwã tã ye' ujtémĩne ie' tã páxlĩnẽ.", 'hypothesis': "Kèq ye' ùtẽnẽ ie' tã páxlĩ.", 'label': 2}
```
#### cni
An example of the test split looks as follows:
```
{'premise': 'Kameetsa, tee nokenkeshireajeroji, iro kantaincha tee nomateroji aisati nintajaro noñanatajiri iroakera.', 'hypothesis': 'Tee noñatajeriji.', 'label': 2}
```
#### gn
An example of the test split looks as follows:
```
{'premise': "Néi, ni napensaikurihína upéva rehe, ajepichaiterei ha añepyrûjey añe'ê hendive.", 'hypothesis': "Nañe'êvéi hendive.", 'label': 2}
```
#### hch
An example of the test split looks as follows:
```
{'premise': 'mu hekwa.', 'hypothesis': 'neuka tita xatawe m+k+ mat+a.', 'label': 2}
```
#### nah
An example of the test split looks as follows:
```
{'premise': 'Cualtitoc, na axnimoihliaya ino, nicualaniztoya queh naha nicamohuihqui', 'hypothesis': 'Ayoc nicamohuihtoc', 'label': 2}
```
#### oto
An example of the test split looks as follows:
```
{'premise': 'mi-ga, nin mibⴘy mbô̮nitho ane guenu, guedi mibⴘy nho ⴘnmⴘy xi di mⴘdi o ñana nen nⴘua manaigui', 'hypothesis': 'hin din bi pengui nen nⴘa', 'label': 2}
```
#### quy
An example of the test split looks as follows:
```
{'premise': 'Allinmi, manam chaypiqa hamutachkarqanichu, ichaqa manam allinchu tarikurqani chaymi kaqllamanta paywan rimarqani.', 'hypothesis': 'Manam paywanqa kaqllamantaqa rimarqani.', 'label': 2}
```
#### shp
An example of the test split looks as follows:
```
{'premise': 'Jakon riki, ja shinanamara ea ike, ikaxbi kikin frustradara ea ike jakopira ea jabe yoyo iribake.', 'hypothesis': 'Eara jabe yoyo iribiama iki.', 'label': 2}
```
#### tar
An example of the test split looks as follows:
```
{'premise': 'Ga’lá ju, ke tási newalayé nejé echi kítira, we ne majáli, a’lí ko uchécho ne yua ku ra’íchaki.', 'hypothesis': 'Tási ne uchecho yua ra’ícha échi rejói.', 'label': 2}
```
### Data Fields
#### all_languages
- language: a string variable identifying the example's language, with possible values aym, bzd, cni, gn, hch, nah, oto, quy, shp, and tar.
- premise: a multilingual string variable, covering the ten languages above.
- hypothesis: a multilingual string variable, covering the ten languages above.
- label: a classification label, with possible values including entailment (0), neutral (1), contradiction (2).
#### aym
- premise: a string feature.
- hypothesis: a string feature.
- label: a classification label, with possible values including entailment (0), neutral (1), contradiction (2).
#### bzd
- premise: a string feature.
- hypothesis: a string feature.
- label: a classification label, with possible values including entailment (0), neutral (1), contradiction (2).
#### cni
- premise: a string feature.
- hypothesis: a string feature.
- label: a classification label, with possible values including entailment (0), neutral (1), contradiction (2).
#### gn
- premise: a string feature.
- hypothesis: a string feature.
- label: a classification label, with possible values including entailment (0), neutral (1), contradiction (2).
#### hch
- premise: a string feature.
- hypothesis: a string feature.
- label: a classification label, with possible values including entailment (0), neutral (1), contradiction (2).
#### nah
- premise: a string feature.
- hypothesis: a string feature.
- label: a classification label, with possible values including entailment (0), neutral (1), contradiction (2).
#### oto
- premise: a string feature.
- hypothesis: a string feature.
- label: a classification label, with possible values including entailment (0), neutral (1), contradiction (2).
#### quy
- premise: a string feature.
- hypothesis: a string feature.
- label: a classification label, with possible values including entailment (0), neutral (1), contradiction (2).
#### shp
- premise: a string feature.
- hypothesis: a string feature.
- label: a classification label, with possible values including entailment (0), neutral (1), contradiction (2).
#### tar
- premise: a string feature.
- hypothesis: a string feature.
- label: a classification label, with possible values including entailment (0), neutral (1), contradiction (2).
### Data Splits
| Language | ISO | Family | Dev | Test |
|-------------------|-----|:-------------|-----:|-----:|
| all_languages | -- | -- | 6457 | 7486 |
| Aymara | aym | Aymaran | 743 | 750 |
| Ashaninka | cni | Arawak | 658 | 750 |
| Bribri | bzd | Chibchan | 743 | 750 |
| Guarani | gn | Tupi-Guarani | 743 | 750 |
| Nahuatl | nah | Uto-Aztecan | 376 | 738 |
| Otomi | oto | Oto-Manguean | 222 | 748 |
| Quechua | quy | Quechuan | 743 | 750 |
| Raramuri | tar | Uto-Aztecan | 743 | 750 |
| Shipibo-Konibo | shp | Panoan | 743 | 750 |
| Wixarika | hch | Uto-Aztecan | 743 | 750 |
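Every config shares the same integer label coding, so predictions can be decoded back to label names with a small helper (a sketch based on the `class_label` definition above):

```python
# Sketch: map AmericasNLI's integer labels back to their names.
# The coding matches the class_label definition used by every config.
ID2LABEL = {0: "entailment", 1: "neutral", 2: "contradiction"}
LABEL2ID = {name: i for i, name in ID2LABEL.items()}

def decode_example(example):
    """Return a copy of an example with the integer label replaced by its name."""
    return {**example, "label": ID2LABEL[example["label"]]}

sample = {"premise": "...", "hypothesis": "...", "label": 2}
print(decode_example(sample)["label"])  # contradiction
```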
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
The authors translate from the Spanish subset of XNLI.
> AmericasNLI is the translation of a subset of XNLI (Conneau et al., 2018). As translators between Spanish and the target languages are more frequently available than those for English, we translate from the Spanish version.
As per paragraph 3.1 of the [original paper](https://arxiv.org/abs/2104.08726).
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
The dataset comprises expert translations from Spanish XNLI.
> Additionally, some translators reported that code-switching is often used to describe certain topics, and, while many words without an exact equivalence in the target language are worked in through translation or interpretation, others are kept in Spanish. To minimize the amount of Spanish vocabulary in the translated examples, we choose sentences from genres that we judged to be relatively easy to translate into the target languages: “face-to-face,” “letters,” and “telephone.”
As per paragraph 3.1 of the [original paper](https://arxiv.org/abs/2104.08726).
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
Creative Commons Attribution Share Alike 4.0 International: https://github.com/abteen/americasnli/blob/main/LICENSE.md
### Citation Information
```
@inproceedings{ebrahimi-etal-2022-americasnli,
title = "{A}mericas{NLI}: Evaluating Zero-shot Natural Language Understanding of Pretrained Multilingual Models in Truly Low-resource Languages",
author = "Ebrahimi, Abteen and
Mager, Manuel and
Oncevay, Arturo and
Chaudhary, Vishrav and
Chiruzzo, Luis and
Fan, Angela and
Ortega, John and
Ramos, Ricardo and
Rios, Annette and
Meza Ruiz, Ivan Vladimir and
Gim{\'e}nez-Lugo, Gustavo and
Mager, Elisabeth and
Neubig, Graham and
Palmer, Alexis and
Coto-Solano, Rolando and
Vu, Thang and
Kann, Katharina",
booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = may,
year = "2022",
address = "Dublin, Ireland",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.acl-long.435",
pages = "6279--6299",
abstract = "Pretrained multilingual models are able to perform cross-lingual transfer in a zero-shot setting, even for languages unseen during pretraining. However, prior work evaluating performance on unseen languages has largely been limited to low-level, syntactic tasks, and it remains unclear if zero-shot learning of high-level, semantic tasks is possible for unseen languages. To explore this question, we present AmericasNLI, an extension of XNLI (Conneau et al., 2018) to 10 Indigenous languages of the Americas. We conduct experiments with XLM-R, testing multiple zero-shot and translation-based approaches. Additionally, we explore model adaptation via continued pretraining and provide an analysis of the dataset by considering hypothesis-only models. We find that XLM-R{'}s zero-shot performance is poor for all 10 languages, with an average performance of 38.48{\%}. Continued pretraining offers improvements, with an average accuracy of 43.85{\%}. Surprisingly, training on poorly translated data by far outperforms all other methods with an accuracy of 49.12{\%}.",
}
```
### Contributions
Thanks to [@fdschmidt93](https://github.com/fdschmidt93) for adding this dataset. |
vwxyzjn/openhermes-dev__meta-llama_Llama-2-70b-chat-hf__1707332943 | ---
dataset_info:
features:
- name: model
dtype: 'null'
- name: category
dtype: string
- name: language
dtype: string
- name: custom_instruction
dtype: bool
- name: id
dtype: string
- name: topic
dtype: string
- name: avatarUrl
dtype: 'null'
- name: idx
dtype: 'null'
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: 'null'
- name: system_prompt
dtype: string
- name: source
dtype: string
- name: model_name
dtype: string
- name: skip_prompt_formatting
dtype: bool
- name: title
dtype: string
- name: hash
dtype: 'null'
- name: views
dtype: 'null'
- name: prompt
dtype: string
- name: token_length
dtype: int64
- name: candidate0
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate1
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate0_policy
dtype: string
- name: candidate1_policy
dtype: string
- name: candidate0_score
dtype: float64
- name: candidate1_score
dtype: float64
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen_policy
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected_policy
dtype: string
splits:
- name: train_prefs
num_bytes: 2299898
num_examples: 167
download_size: 1363484
dataset_size: 2299898
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
---
|
ashraq/ott-qa-20k | ---
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: header
sequence: string
- name: data
sequence:
sequence: string
- name: section_title
dtype: string
- name: section_text
dtype: string
- name: uid
dtype: string
- name: intro
dtype: string
splits:
- name: train
num_bytes: 41038376
num_examples: 20000
download_size: 23329221
dataset_size: 41038376
---
# Dataset Card for "ott-qa-20k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
The data was obtained from [here](https://github.com/wenhuchen/OTT-QA) |
sajjadrauf/tolokaVQA | ---
license: other
---
|
sooks/id1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': AI
'1': Human
splits:
- name: train
num_bytes: 2034793520.8400002
num_examples: 301359
- name: test
num_bytes: 358763931.71400005
num_examples: 53181
download_size: 2387343160
dataset_size: 2393557452.5540004
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
yjernite/prof_report__SD_v2_random_seeds__sd_21__24 | ---
dataset_info:
features:
- name: cluster_id
dtype: int64
- name: cluster_size
dtype: int64
- name: img_ids
sequence: int64
- name: img_cluster_scores
sequence: float64
splits:
- name: paralegal
num_bytes: 3552
num_examples: 8
- name: bartender
num_bytes: 3480
num_examples: 5
- name: facilities_manager
num_bytes: 3624
num_examples: 11
- name: accountant
num_bytes: 3600
num_examples: 10
- name: graphic_designer
num_bytes: 3504
num_examples: 6
- name: network_administrator
num_bytes: 3552
num_examples: 8
- name: financial_manager
num_bytes: 3576
num_examples: 9
- name: baker
num_bytes: 3672
num_examples: 13
- name: security_guard
num_bytes: 3528
num_examples: 7
- name: artist
num_bytes: 3696
num_examples: 14
- name: author
num_bytes: 3504
num_examples: 6
- name: printing_press_operator
num_bytes: 3768
num_examples: 17
- name: public_relations_specialist
num_bytes: 3576
num_examples: 9
- name: sheet_metal_worker
num_bytes: 3624
num_examples: 11
- name: clergy
num_bytes: 3624
num_examples: 11
- name: payroll_clerk
num_bytes: 3552
num_examples: 8
- name: teller
num_bytes: 3672
num_examples: 13
- name: real_estate_broker
num_bytes: 3456
num_examples: 4
- name: customer_service_representative
num_bytes: 3576
num_examples: 9
- name: painter
num_bytes: 3696
num_examples: 14
- name: tractor_operator
num_bytes: 3504
num_examples: 6
- name: dental_hygienist
num_bytes: 3504
num_examples: 6
- name: industrial_engineer
num_bytes: 3480
num_examples: 5
- name: electrician
num_bytes: 3456
num_examples: 4
- name: head_cook
num_bytes: 3648
num_examples: 12
- name: health_technician
num_bytes: 3624
num_examples: 11
- name: carpet_installer
num_bytes: 3456
num_examples: 4
- name: purchasing_agent
num_bytes: 3528
num_examples: 7
- name: supervisor
num_bytes: 3624
num_examples: 11
- name: civil_engineer
num_bytes: 3504
num_examples: 6
- name: lawyer
num_bytes: 3600
num_examples: 10
- name: language_pathologist
num_bytes: 3720
num_examples: 15
- name: ceo
num_bytes: 3528
num_examples: 7
- name: computer_support_specialist
num_bytes: 3600
num_examples: 10
- name: postal_worker
num_bytes: 3720
num_examples: 15
- name: mechanical_engineer
num_bytes: 3480
num_examples: 5
- name: nursing_assistant
num_bytes: 3600
num_examples: 10
- name: dentist
num_bytes: 3480
num_examples: 5
- name: tutor
num_bytes: 3696
num_examples: 14
- name: butcher
num_bytes: 3576
num_examples: 9
- name: insurance_agent
num_bytes: 3528
num_examples: 7
- name: courier
num_bytes: 3600
num_examples: 10
- name: computer_programmer
num_bytes: 3480
num_examples: 5
- name: truck_driver
num_bytes: 3528
num_examples: 7
- name: mechanic
num_bytes: 3480
num_examples: 5
- name: marketing_manager
num_bytes: 3528
num_examples: 7
- name: sales_manager
num_bytes: 3504
num_examples: 6
- name: correctional_officer
num_bytes: 3648
num_examples: 12
- name: manager
num_bytes: 3528
num_examples: 7
- name: underwriter
num_bytes: 3600
num_examples: 10
- name: executive_assistant
num_bytes: 3528
num_examples: 7
- name: designer
num_bytes: 3504
num_examples: 6
- name: groundskeeper
num_bytes: 3504
num_examples: 6
- name: mental_health_counselor
num_bytes: 3672
num_examples: 13
- name: aerospace_engineer
num_bytes: 3504
num_examples: 6
- name: taxi_driver
num_bytes: 3528
num_examples: 7
- name: nurse
num_bytes: 3528
num_examples: 7
- name: data_entry_keyer
num_bytes: 3480
num_examples: 5
- name: musician
num_bytes: 3624
num_examples: 11
- name: event_planner
num_bytes: 3696
num_examples: 14
- name: writer
num_bytes: 3576
num_examples: 9
- name: cook
num_bytes: 3648
num_examples: 12
- name: welder
num_bytes: 3624
num_examples: 11
- name: producer
num_bytes: 3528
num_examples: 7
- name: hairdresser
num_bytes: 3600
num_examples: 10
- name: farmer
num_bytes: 3456
num_examples: 4
- name: construction_worker
num_bytes: 3480
num_examples: 5
- name: air_conditioning_installer
num_bytes: 3456
num_examples: 4
- name: electrical_engineer
num_bytes: 3504
num_examples: 6
- name: occupational_therapist
num_bytes: 3600
num_examples: 10
- name: career_counselor
num_bytes: 3576
num_examples: 9
- name: interior_designer
num_bytes: 3552
num_examples: 8
- name: jailer
num_bytes: 3648
num_examples: 12
- name: office_clerk
num_bytes: 3576
num_examples: 9
- name: market_research_analyst
num_bytes: 3624
num_examples: 11
- name: laboratory_technician
num_bytes: 3648
num_examples: 12
- name: social_assistant
num_bytes: 3576
num_examples: 9
- name: medical_records_specialist
num_bytes: 3576
num_examples: 9
- name: machinery_mechanic
num_bytes: 3456
num_examples: 4
- name: police_officer
num_bytes: 3648
num_examples: 12
- name: software_developer
num_bytes: 3504
num_examples: 6
- name: clerk
num_bytes: 3696
num_examples: 14
- name: salesperson
num_bytes: 3624
num_examples: 11
- name: social_worker
num_bytes: 3648
num_examples: 12
- name: director
num_bytes: 3576
num_examples: 9
- name: fast_food_worker
num_bytes: 3648
num_examples: 12
- name: singer
num_bytes: 3720
num_examples: 15
- name: metal_worker
num_bytes: 3528
num_examples: 7
- name: cleaner
num_bytes: 3696
num_examples: 14
- name: computer_systems_analyst
num_bytes: 3576
num_examples: 9
- name: dental_assistant
num_bytes: 3504
num_examples: 6
- name: psychologist
num_bytes: 3576
num_examples: 9
- name: machinist
num_bytes: 3456
num_examples: 4
- name: therapist
num_bytes: 3600
num_examples: 10
- name: veterinarian
num_bytes: 3528
num_examples: 7
- name: teacher
num_bytes: 3672
num_examples: 13
- name: architect
num_bytes: 3528
num_examples: 7
- name: office_worker
num_bytes: 3504
num_examples: 6
- name: drywall_installer
num_bytes: 3456
num_examples: 4
- name: nutritionist
num_bytes: 3552
num_examples: 8
- name: librarian
num_bytes: 3600
num_examples: 10
- name: childcare_worker
num_bytes: 3600
num_examples: 10
- name: school_bus_driver
num_bytes: 3744
num_examples: 16
- name: file_clerk
num_bytes: 3648
num_examples: 12
- name: logistician
num_bytes: 3504
num_examples: 6
- name: scientist
num_bytes: 3600
num_examples: 10
- name: teaching_assistant
num_bytes: 3552
num_examples: 8
- name: radiologic_technician
num_bytes: 3600
num_examples: 10
- name: manicurist
num_bytes: 3624
num_examples: 11
- name: community_manager
num_bytes: 3552
num_examples: 8
- name: carpenter
num_bytes: 3456
num_examples: 4
- name: claims_appraiser
num_bytes: 3528
num_examples: 7
- name: dispatcher
num_bytes: 3624
num_examples: 11
- name: cashier
num_bytes: 3672
num_examples: 13
- name: roofer
num_bytes: 3456
num_examples: 4
- name: photographer
num_bytes: 3624
num_examples: 11
- name: detective
num_bytes: 3600
num_examples: 10
- name: financial_advisor
num_bytes: 3552
num_examples: 8
- name: wholesale_buyer
num_bytes: 3672
num_examples: 13
- name: it_specialist
num_bytes: 3576
num_examples: 9
- name: pharmacy_technician
num_bytes: 3552
num_examples: 8
- name: engineer
num_bytes: 3480
num_examples: 5
- name: mover
num_bytes: 3600
num_examples: 10
- name: plane_mechanic
num_bytes: 3504
num_examples: 6
- name: interviewer
num_bytes: 3648
num_examples: 12
- name: massage_therapist
num_bytes: 3624
num_examples: 11
- name: dishwasher
num_bytes: 3600
num_examples: 10
- name: fitness_instructor
num_bytes: 3576
num_examples: 9
- name: credit_counselor
num_bytes: 3552
num_examples: 8
- name: stocker
num_bytes: 3672
num_examples: 13
- name: pharmacist
num_bytes: 3576
num_examples: 9
- name: doctor
num_bytes: 3600
num_examples: 10
- name: compliance_officer
num_bytes: 3648
num_examples: 12
- name: aide
num_bytes: 3672
num_examples: 13
- name: bus_driver
num_bytes: 3600
num_examples: 10
- name: financial_analyst
num_bytes: 3528
num_examples: 7
- name: receptionist
num_bytes: 3552
num_examples: 8
- name: janitor
num_bytes: 3576
num_examples: 9
- name: plumber
num_bytes: 3408
num_examples: 2
- name: physical_therapist
num_bytes: 3624
num_examples: 11
- name: inventory_clerk
num_bytes: 3528
num_examples: 7
- name: firefighter
num_bytes: 3600
num_examples: 10
- name: coach
num_bytes: 3600
num_examples: 10
- name: maid
num_bytes: 3672
num_examples: 13
- name: pilot
num_bytes: 3576
num_examples: 9
- name: repair_worker
num_bytes: 3576
num_examples: 9
download_size: 868129
dataset_size: 522024
---
# Dataset Card for "prof_report__SD_v2_random_seeds__sd_21__24"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_stsb_she_inanimate_objects | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 13331
num_examples: 61
- name: test
num_bytes: 7558
num_examples: 43
- name: train
num_bytes: 25416
num_examples: 119
download_size: 40566
dataset_size: 46305
---
# Dataset Card for "MULTI_VALUE_stsb_she_inanimate_objects"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rec456/vozcillianmurphy | ---
license: openrail
---
|
open-llm-leaderboard/details_NovoCode__Mistral-NeuralDPO-v0.2 | ---
pretty_name: Evaluation run of NovoCode/Mistral-NeuralDPO-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NovoCode/Mistral-NeuralDPO-v0.2](https://huggingface.co/NovoCode/Mistral-NeuralDPO-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NovoCode__Mistral-NeuralDPO-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-19T06:05:02.538457](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__Mistral-NeuralDPO-v0.2/blob/main/results_2024-02-19T06-05-02.538457.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6262346035274842,\n\
\ \"acc_stderr\": 0.03258054476573943,\n \"acc_norm\": 0.6312971512345912,\n\
\ \"acc_norm_stderr\": 0.03324902767301004,\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.016571797910626615,\n \"mc2\": 0.48726838826281,\n\
\ \"mc2_stderr\": 0.015809965700715564\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6382252559726962,\n \"acc_stderr\": 0.014041957945038075,\n\
\ \"acc_norm\": 0.6706484641638225,\n \"acc_norm_stderr\": 0.013734057652635474\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6564429396534555,\n\
\ \"acc_stderr\": 0.0047392481181180056,\n \"acc_norm\": 0.8501294562836088,\n\
\ \"acc_norm_stderr\": 0.003562149890962712\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396264,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396264\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7483870967741936,\n \"acc_stderr\": 0.02468597928623996,\n \"\
acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.02468597928623996\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"\
acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094774,\n\
\ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094774\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083018,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083018\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8201834862385321,\n \"acc_stderr\": 0.016465345467391534,\n \"\
acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.016465345467391534\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.034465133507525975,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.034465133507525975\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7994891443167306,\n\
\ \"acc_stderr\": 0.014317653708594206,\n \"acc_norm\": 0.7994891443167306,\n\
\ \"acc_norm_stderr\": 0.014317653708594206\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2335195530726257,\n\
\ \"acc_stderr\": 0.014149575348976269,\n \"acc_norm\": 0.2335195530726257,\n\
\ \"acc_norm_stderr\": 0.014149575348976269\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.0248480182638752,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.0248480182638752\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495036,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495036\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4491525423728814,\n\
\ \"acc_stderr\": 0.012704030518851491,\n \"acc_norm\": 0.4491525423728814,\n\
\ \"acc_norm_stderr\": 0.012704030518851491\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.619281045751634,\n \"acc_stderr\": 0.0196438015579248,\n \
\ \"acc_norm\": 0.619281045751634,\n \"acc_norm_stderr\": 0.0196438015579248\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249772,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249772\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.028782108105401712,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.028782108105401712\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.016571797910626615,\n \"mc2\": 0.48726838826281,\n\
\ \"mc2_stderr\": 0.015809965700715564\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.01095971643524291\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36694465504169826,\n \
\ \"acc_stderr\": 0.013275883047712213\n }\n}\n```"
repo_url: https://huggingface.co/NovoCode/Mistral-NeuralDPO-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|arc:challenge|25_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|gsm8k|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hellaswag|10_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T06-05-02.538457.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T06-05-02.538457.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- '**/details_harness|winogrande|5_2024-02-19T06-05-02.538457.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-19T06-05-02.538457.parquet'
- config_name: results
data_files:
- split: 2024_02_19T06_05_02.538457
path:
- results_2024-02-19T06-05-02.538457.parquet
- split: latest
path:
- results_2024-02-19T06-05-02.538457.parquet
---
# Dataset Card for Evaluation run of NovoCode/Mistral-NeuralDPO-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NovoCode/Mistral-NeuralDPO-v0.2](https://huggingface.co/NovoCode/Mistral-NeuralDPO-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# "latest" always points to the most recent evaluation run for this task.
data = load_dataset(
    "open-llm-leaderboard/details_NovoCode__Mistral-NeuralDPO-v0.2",
    "harness_winogrande_5",
    split="latest",
)
```
## Latest results
These are the [latest results from run 2024-02-19T06:05:02.538457](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__Mistral-NeuralDPO-v0.2/blob/main/results_2024-02-19T06-05-02.538457.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results files and in the "latest" split of each eval's configuration):
```json
{
"all": {
"acc": 0.6262346035274842,
"acc_stderr": 0.03258054476573943,
"acc_norm": 0.6312971512345912,
"acc_norm_stderr": 0.03324902767301004,
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626615,
"mc2": 0.48726838826281,
"mc2_stderr": 0.015809965700715564
},
"harness|arc:challenge|25": {
"acc": 0.6382252559726962,
"acc_stderr": 0.014041957945038075,
"acc_norm": 0.6706484641638225,
"acc_norm_stderr": 0.013734057652635474
},
"harness|hellaswag|10": {
"acc": 0.6564429396534555,
"acc_stderr": 0.0047392481181180056,
"acc_norm": 0.8501294562836088,
"acc_norm_stderr": 0.003562149890962712
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396264,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396264
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.02468597928623996,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.02468597928623996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094774,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094774
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083018,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.016465345467391534,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.016465345467391534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.034465133507525975,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.034465133507525975
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7994891443167306,
"acc_stderr": 0.014317653708594206,
"acc_norm": 0.7994891443167306,
"acc_norm_stderr": 0.014317653708594206
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2335195530726257,
"acc_stderr": 0.014149575348976269,
"acc_norm": 0.2335195530726257,
"acc_norm_stderr": 0.014149575348976269
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495036,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495036
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4491525423728814,
"acc_stderr": 0.012704030518851491,
"acc_norm": 0.4491525423728814,
"acc_norm_stderr": 0.012704030518851491
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.619281045751634,
"acc_stderr": 0.0196438015579248,
"acc_norm": 0.619281045751634,
"acc_norm_stderr": 0.0196438015579248
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249772,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249772
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.028782108105401712,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.028782108105401712
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626615,
"mc2": 0.48726838826281,
"mc2_stderr": 0.015809965700715564
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.01095971643524291
},
"harness|gsm8k|5": {
"acc": 0.36694465504169826,
"acc_stderr": 0.013275883047712213
}
}
```
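As a quick sanity check, each reported `acc_stderr` appears to be the sample standard error of the per-question 0/1 scores, i.e. sqrt(p·(1−p)/(n−1)) for accuracy p over n questions. A minimal sketch (the count of 100 questions for `abstract_algebra` is an assumption based on the public MMLU test split, and `binomial_stderr` is an illustrative helper, not part of the harness):

```python
import math

def binomial_stderr(acc: float, n: int) -> float:
    # Sample standard error of the mean of n binary (0/1) scores.
    return math.sqrt(acc * (1.0 - acc) / (n - 1))

# abstract_algebra above: acc = 0.32 over an assumed 100 questions.
print(binomial_stderr(0.32, 100))  # ≈ 0.04688, matching acc_stderr above
```

The same relation can be used to back out the effective question count of any subset from its reported `acc` and `acc_stderr`.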
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Lin-Chen/MMStar | ---
task_categories:
- multiple-choice
- question-answering
- visual-question-answering
language:
- en
size_categories:
- 1K<n<10K
configs:
- config_name: val
data_files:
- split: val
path: "mmstar.parquet"
dataset_info:
- config_name: val
features:
- name: index
dtype: int64
- name: question
dtype: string
- name: image
dtype: image
- name: answer
dtype: string
- name: category
dtype: string
- name: l2_category
dtype: string
- name: meta_info
struct:
- name: source
dtype: string
- name: split
dtype: string
- name: image_path
dtype: string
splits:
- name: val
num_bytes: 44831593
num_examples: 1500
---
# MMStar (Are We on the Right Way for Evaluating Large Vision-Language Models?)
[**🌐 Homepage**](https://mmstar-benchmark.github.io/) | [**🤗 Dataset**](https://huggingface.co/datasets/Lin-Chen/MMStar) | [**🤗 Paper**](https://huggingface.co/papers/2403.20330) | [**📖 arXiv**](https://arxiv.org/pdf/2403.20330.pdf) | [**GitHub**](https://github.com/MMStar-Benchmark/MMStar)
## Dataset Details
As shown in the figure below, existing benchmarks lack consideration of the vision dependency of evaluation samples and potential data leakage from LLMs' and LVLMs' training data.
<p align="center">
<img src="https://raw.githubusercontent.com/MMStar-Benchmark/MMStar/main/resources/4_case_in_1.png" width="80%"> <br>
</p>
Therefore, we introduce MMStar: an elite vision-indispensable multi-modal benchmark, aiming to ensure each curated sample exhibits **visual dependency** and **minimal data leakage**, and **requires advanced multi-modal capabilities**.
🎯 **We have released the full set of 1,500 samples for offline evaluation.** After applying the coarse filter process and manual review, we narrow down from a total of 22,401 samples to 11,607 candidate samples and finally select 1,500 high-quality samples to construct our MMStar benchmark.
<p align="center">
<img src="https://raw.githubusercontent.com/MMStar-Benchmark/MMStar/main/resources/data_source.png" width="80%"> <br>
</p>
In MMStar, we display **6 core capabilities** in the inner ring, with **18 detailed axes** presented in the outer ring. The middle ring showcases the number of samples for each detailed dimension. Each core capability contains a meticulously **balanced set of 250 samples**. We further ensure a relatively even distribution across the 18 detailed axes.
<p align="center">
<img src="https://raw.githubusercontent.com/MMStar-Benchmark/MMStar/main/resources/mmstar.png" width="60%"> <br>
</p>
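The fields declared in the YAML header above (`index`, `question`, `image`, `answer`, `category`, `l2_category`) make it straightforward to break a model's score down by core capability. Below is a minimal, hypothetical sketch of aggregating per-category accuracy from a set of predictions; the record and prediction shapes are illustrative assumptions, not the official evaluation code.

```python
from collections import defaultdict

def per_category_accuracy(records, predictions):
    """Aggregate accuracy per 'category' field.

    records:     iterable of dicts with 'index', 'answer', 'category'
                 (the same fields declared in the card's YAML header).
    predictions: dict mapping sample index -> predicted answer letter.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for rec in records:
        total[rec["category"]] += 1
        if predictions.get(rec["index"]) == rec["answer"]:
            correct[rec["category"]] += 1
    return {cat: correct[cat] / total[cat] for cat in total}

# Toy usage with two of MMStar's six core capabilities:
records = [
    {"index": 0, "answer": "A", "category": "coarse perception"},
    {"index": 1, "answer": "B", "category": "coarse perception"},
    {"index": 2, "answer": "C", "category": "math"},
]
print(per_category_accuracy(records, {0: "A", 1: "C", 2: "C"}))
# -> {'coarse perception': 0.5, 'math': 1.0}
```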
## 🏆 Mini-Leaderboard
We show a mini-leaderboard here; please find more information in our paper or on the [homepage](https://mmstar-benchmark.github.io/).
| Model | Acc. | MG ⬆ | ML ⬇ |
|----------------------------|:---------:|:------------:|:------------:|
| GPT4V (high)| **57.1** | **43.6** | 1.3 |
| InternLM-Xcomposer2| 55.4 | 28.1 | 7.5|
| LLaVA-Next-34B |52.1|29.4|2.4|
|GPT4V (low)|46.1|32.6|1.3|
|InternVL-Chat-v1.2|43.7|32.6|**0.0**|
|GeminiPro-Vision|42.6|27.4|**0.0**|
|Sphinx-X-MoE|38.9|14.8|1.0|
|Monkey-Chat|38.3|13.5|17.6|
|Yi-VL-6B|37.9|15.6|**0.0**|
|Qwen-VL-Chat|37.5|23.9|**0.0**|
|Deepseek-VL-7B|37.1|15.7|**0.0**|
|CogVLM-Chat|36.5|14.9|**0.0**|
|Yi-VL-34B|36.1|18.8|**0.0**|
|TinyLLaVA|36.0|16.4|7.6|
|ShareGPT4V-7B|33.0|11.9|**0.0**|
|LLaVA-1.5-13B|32.8|13.9|**0.0**|
|LLaVA-1.5-7B|30.3|10.7|**0.0**|
|Random Choice|24.6|-|-|
## 📧 Contact
- [Lin Chen](https://lin-chen.site/): chlin@mail.ustc.edu.cn
- [Jinsong Li](https://li-jinsong.github.io/): lijingsong@pjlab.org.cn
## ✒️ Citation
If you find our work helpful for your research, please consider giving a star ⭐ and citation 📝
```bibtex
@article{chen2024we,
title={Are We on the Right Way for Evaluating Large Vision-Language Models?},
author={Chen, Lin and Li, Jinsong and Dong, Xiaoyi and Zhang, Pan and Zang, Yuhang and Chen, Zehui and Duan, Haodong and Wang, Jiaqi and Qiao, Yu and Lin, Dahua and others},
journal={arXiv preprint arXiv:2403.20330},
year={2024}
}
``` |
BangumiBase/darlinginthefranxx | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Darling In The Franxx
This is the image base of the bangumi Darling in the FranXX. We detected 72 characters and 7,520 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may still contain noise.** If you intend to train models manually on this dataset, we recommend performing the necessary preprocessing on the downloaded files to eliminate potentially noisy samples (roughly a 1% incidence).
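Each per-character archive linked in the table below is a zip of images. A minimal local sketch for unpacking one and listing its files, so the roughly 1% of noisy samples can be reviewed before training (the path `20/dataset.zip` is purely illustrative):

```python
import os
import zipfile

def extract_character_archive(zip_path, out_dir):
    """Unpack one per-character dataset.zip and return the extracted file names."""
    os.makedirs(out_dir, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(out_dir)
    return sorted(os.listdir(out_dir))

# e.g. after downloading character #20's archive:
# files = extract_character_archive("20/dataset.zip", "character_20")
```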
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 874 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 61 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 38 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 17 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 53 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 14 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 7 | [Download](6/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 7 | 19 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 27 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 8 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 243 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 26 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 184 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 207 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 34 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 33 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 143 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 320 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 21 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 22 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 1188 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 137 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 43 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 41 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 44 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 38 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 35 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 32 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 15 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 40 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 29 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 15 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 13 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 21 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 15 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 7 | [Download](35/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 36 | 8 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 14 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 510 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 554 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 23 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 23 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 27 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 79 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 235 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 15 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 490 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 158 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 26 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 44 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 299 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 31 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 36 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 340 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 32 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 22 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 33 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 9 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 10 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 8 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 10 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 10 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 16 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 33 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 25 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 5 | [Download](65/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 66 | 25 | [Download](66/dataset.zip) |  |  |  |  |  |  |  |  |
| 67 | 16 | [Download](67/dataset.zip) |  |  |  |  |  |  |  |  |
| 68 | 37 | [Download](68/dataset.zip) |  |  |  |  |  |  |  |  |
| 69 | 24 | [Download](69/dataset.zip) |  |  |  |  |  |  |  |  |
| 70 | 18 | [Download](70/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 211 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
open-llm-leaderboard/details_robinsmits__Qwen1.5-7B-Dutch-Chat | ---
pretty_name: Evaluation run of robinsmits/Qwen1.5-7B-Dutch-Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [robinsmits/Qwen1.5-7B-Dutch-Chat](https://huggingface.co/robinsmits/Qwen1.5-7B-Dutch-Chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_robinsmits__Qwen1.5-7B-Dutch-Chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T17:13:24.716005](https://huggingface.co/datasets/open-llm-leaderboard/details_robinsmits__Qwen1.5-7B-Dutch-Chat/blob/main/results_2024-03-29T17-13-24.716005.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6143669474488321,\n\
\ \"acc_stderr\": 0.03289058413769806,\n \"acc_norm\": 0.6246833994085433,\n\
\ \"acc_norm_stderr\": 0.03360371872268345,\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.453395289107136,\n\
\ \"mc2_stderr\": 0.014822506332311901\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5110921501706485,\n \"acc_stderr\": 0.014607794914013053,\n\
\ \"acc_norm\": 0.5392491467576792,\n \"acc_norm_stderr\": 0.014566303676636581\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5656243776140211,\n\
\ \"acc_stderr\": 0.004946617138983514,\n \"acc_norm\": 0.7603067118103963,\n\
\ \"acc_norm_stderr\": 0.004260238033657902\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n\
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.03724249595817729,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.03724249595817729\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.04043461861916747,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.04043461861916747\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48412698412698413,\n \"acc_stderr\": 0.025738330639412152,\n \"\
acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.025738330639412152\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.024580028921481,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.024580028921481\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6009852216748769,\n \"acc_stderr\": 0.034454876862647144,\n\
\ \"acc_norm\": 0.6009852216748769,\n \"acc_norm_stderr\": 0.034454876862647144\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\"\
: 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548047,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548047\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396993,\n\
\ \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396993\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.02904560029061626,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.02904560029061626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295813,\n \"\
acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295813\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098825,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098825\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.01498727064094601,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.01498727064094601\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242832,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242832\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3418994413407821,\n\
\ \"acc_stderr\": 0.015864506461604644,\n \"acc_norm\": 0.3418994413407821,\n\
\ \"acc_norm_stderr\": 0.015864506461604644\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015684,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015684\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.02662415247884585,\n\
\ \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.02662415247884585\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.02931601177634356,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.02931601177634356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45241199478487615,\n\
\ \"acc_stderr\": 0.012712265105889133,\n \"acc_norm\": 0.45241199478487615,\n\
\ \"acc_norm_stderr\": 0.012712265105889133\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329383,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329383\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5931372549019608,\n \"acc_stderr\": 0.019873802005061173,\n \
\ \"acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.019873802005061173\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.032467217651178264,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.032467217651178264\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.453395289107136,\n\
\ \"mc2_stderr\": 0.014822506332311901\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6882399368587214,\n \"acc_stderr\": 0.013018571197638551\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15466262319939347,\n \
\ \"acc_stderr\": 0.0099597862209172\n }\n}\n```"
repo_url: https://huggingface.co/robinsmits/Qwen1.5-7B-Dutch-Chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|arc:challenge|25_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|gsm8k|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hellaswag|10_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T17-13-24.716005.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T17-13-24.716005.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- '**/details_harness|winogrande|5_2024-03-29T17-13-24.716005.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T17-13-24.716005.parquet'
- config_name: results
data_files:
- split: 2024_03_29T17_13_24.716005
path:
- results_2024-03-29T17-13-24.716005.parquet
- split: latest
path:
- results_2024-03-29T17-13-24.716005.parquet
---
# Dataset Card for Evaluation run of robinsmits/Qwen1.5-7B-Dutch-Chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [robinsmits/Qwen1.5-7B-Dutch-Chat](https://huggingface.co/robinsmits/Qwen1.5-7B-Dutch-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_robinsmits__Qwen1.5-7B-Dutch-Chat",
"harness_winogrande_5",
	split="latest")
```
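Each per-task entry in the results carries `acc`/`acc_norm` scores like those shown under "Latest results" below. As a minimal sketch of how such scores can be macro-averaged across tasks (using an illustrative subset of the scores hard-coded here; in practice you would read them from the "results" configuration):

```python
# Macro-average per-task accuracies from a results-style dict.
# The scores below are an illustrative subset copied from the latest run.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.5392491467576792},
    "harness|hellaswag|10": {"acc_norm": 0.7603067118103963},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.43},
}

# Unweighted mean over tasks (each task counts equally).
macro_avg = sum(task["acc_norm"] for task in results.values()) / len(results)
print(round(macro_avg, 4))
```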
## Latest results
These are the [latest results from run 2024-03-29T17:13:24.716005](https://huggingface.co/datasets/open-llm-leaderboard/details_robinsmits__Qwen1.5-7B-Dutch-Chat/blob/main/results_2024-03-29T17-13-24.716005.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6143669474488321,
"acc_stderr": 0.03289058413769806,
"acc_norm": 0.6246833994085433,
"acc_norm_stderr": 0.03360371872268345,
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.453395289107136,
"mc2_stderr": 0.014822506332311901
},
"harness|arc:challenge|25": {
"acc": 0.5110921501706485,
"acc_stderr": 0.014607794914013053,
"acc_norm": 0.5392491467576792,
"acc_norm_stderr": 0.014566303676636581
},
"harness|hellaswag|10": {
"acc": 0.5656243776140211,
"acc_stderr": 0.004946617138983514,
"acc_norm": 0.7603067118103963,
"acc_norm_stderr": 0.004260238033657902
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817729,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817729
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.04043461861916747,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.04043461861916747
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.025738330639412152,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.025738330639412152
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6009852216748769,
"acc_stderr": 0.034454876862647144,
"acc_norm": 0.6009852216748769,
"acc_norm_stderr": 0.034454876862647144
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548047,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548047
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396993,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396993
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.02904560029061626,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.02904560029061626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295813,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295813
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098825,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098825
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.01498727064094601,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.01498727064094601
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242832,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242832
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3418994413407821,
"acc_stderr": 0.015864506461604644,
"acc_norm": 0.3418994413407821,
"acc_norm_stderr": 0.015864506461604644
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.02671611838015684,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.02671611838015684
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.02662415247884585,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.02662415247884585
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.02931601177634356,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.02931601177634356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45241199478487615,
"acc_stderr": 0.012712265105889133,
"acc_norm": 0.45241199478487615,
"acc_norm_stderr": 0.012712265105889133
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.030187532060329383,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.030187532060329383
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.019873802005061173,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.019873802005061173
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.032467217651178264,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.032467217651178264
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.453395289107136,
"mc2_stderr": 0.014822506332311901
},
"harness|winogrande|5": {
"acc": 0.6882399368587214,
"acc_stderr": 0.013018571197638551
},
"harness|gsm8k|5": {
"acc": 0.15466262319939347,
"acc_stderr": 0.0099597862209172
}
}
```
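For a quick sanity check, the per-task accuracies in the JSON above can be averaged by hand; a minimal sketch (the values are copied from the results above, and the selection is an illustrative subset, not the full MMLU average):

```python
# Average a few of the per-task accuracies reported above.
# Keys and values are copied from the JSON results; this subset is
# illustrative only and does not reproduce the official aggregate.
scores = {
    "high_school_computer_science": 0.72,
    "high_school_geography": 0.803030303030303,
    "marketing": 0.9017094017094017,
    "virology": 0.4939759036144578,
}

mean_acc = sum(scores.values()) / len(scores)
print(round(mean_acc, 4))  # 0.7297
```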
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ZachW/MGTDetect_CoCo | ---
license: mit
task_categories:
- text-classification
language:
- en
pretty_name: CoCo_MGT_Detection
size_categories:
- 10K<n<100K
---
This is the open-sourced dataset of our EMNLP 2023 paper **CoCo: Coherence-Enhanced Machine-Generated Text Detection Under Low Resource With Contrastive Learning** https://arxiv.org/abs/2212.10341 from XJTU.
The dataset pairs new-style texts generated by GPT-2, Grover, and GPT-3.5 with up-to-date human-written texts from newspapers.
It is intended for machine-generated text detection tasks. Check out our paper for deeper insights!
A more detailed description of the use case is at https://github.com/YichenZW/Coh-MGT-Detection. If you have any problems using it, please feel free to contact us! |
ovior/twitter_dataset_1713182919 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2442358
num_examples: 7197
download_size: 1409205
dataset_size: 2442358
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
terhdavid/proba_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: tokens
sequence: string
- name: ner
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-MISC
'8': I-MISC
splits:
- name: train
num_bytes: 143190.77989130435
num_examples: 662
- name: test
num_bytes: 16006.220108695652
num_examples: 74
- name: validation
num_bytes: 16006.220108695652
num_examples: 74
download_size: 36090
dataset_size: 175203.22010869565
---
# Dataset Card for "proba_dataset"
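The `ner` feature stores integer class ids; a minimal sketch of decoding them back to BIO tag strings, assuming the id-to-name mapping listed in the YAML above:

```python
# Id-to-tag mapping copied from the class_label names in the YAML above.
ID2TAG = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG",
          "B-LOC", "I-LOC", "B-MISC", "I-MISC"]

def decode_tags(ids):
    """Map a sequence of integer label ids to BIO tag strings."""
    return [ID2TAG[i] for i in ids]

print(decode_tags([1, 2, 0, 5]))  # ['B-PER', 'I-PER', 'O', 'B-LOC']
```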
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tomeheya/IM-417-128 | ---
license: apache-2.0
pretty_name: Indus Script - IM-417 Sign list
---
Dataset containing all 417 IVC characters listed in "The Indus Script: Texts, Concordance and Tables" compiled by Iravatham Mahadevan, rendered at 128x128 resolution, hence the name "IM-417-128".
Label Citation: Pages 32-34 from "The Indus Script: Texts, Concordance and Tables" compiled by Iravatham Mahadevan, published by the Archaeological Survey of India (1977)
Characters were extracted from images of seals that belong to the Indus Valley Civilisation.
The name of the enclosing subfolder is the label as it appears in the Sign List Pages 32-34.
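Because the enclosing subfolder name is the label, the label can be recovered directly from a file path; a minimal sketch using the standard library (the example path below is hypothetical):

```python
from pathlib import Path

def label_for(image_path: str) -> str:
    """Derive the sign label from the name of the enclosing subfolder."""
    return Path(image_path).parent.name

# Hypothetical layout: <dataset root>/<label>/<image file>
print(label_for("IM-417-128/042/sample_001.png"))  # 042
```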
Hence: label = enclosing subfolder name, data = image. |
AISE-TUDelft/ML4SE23_G1_MBPP-SCoT | ---
task_categories:
- text-generation
language:
- en
tags:
- code
pretty_name: MBPP enhanced dataset with Structured-Chain-of-Thought
size_categories:
- n<1K
---
# ML4SE23_G1_MBPP-SCoT
MBPP enhanced dataset with Structured-Chain-of-Thought |
open-llm-leaderboard/details_xDAN2099__xDAN-L2-moe-2x-v1 | ---
pretty_name: Evaluation run of xDAN2099/xDAN-L2-moe-2x-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xDAN2099/xDAN-L2-moe-2x-v1](https://huggingface.co/xDAN2099/xDAN-L2-moe-2x-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xDAN2099__xDAN-L2-moe-2x-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T11:58:02.756350](https://huggingface.co/datasets/open-llm-leaderboard/details_xDAN2099__xDAN-L2-moe-2x-v1/blob/main/results_2024-01-16T11-58-02.756350.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7649759059339861,\n\
\ \"acc_stderr\": 0.02802747077552357,\n \"acc_norm\": 0.7678303278344503,\n\
\ \"acc_norm_stderr\": 0.02857170363137812,\n \"mc1\": 0.46266829865361075,\n\
\ \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.6176977126106841,\n\
\ \"mc2_stderr\": 0.014998426067966347\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6638225255972696,\n \"acc_stderr\": 0.013804855026205761,\n\
\ \"acc_norm\": 0.6851535836177475,\n \"acc_norm_stderr\": 0.01357265770308495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6736705835490938,\n\
\ \"acc_stderr\": 0.004679111783653905,\n \"acc_norm\": 0.8630750846444931,\n\
\ \"acc_norm_stderr\": 0.0034306550069275773\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n\
\ \"acc_stderr\": 0.039992628766177214,\n \"acc_norm\": 0.6888888888888889,\n\
\ \"acc_norm_stderr\": 0.039992628766177214\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.9013157894736842,\n \"acc_stderr\": 0.02427022773752271,\n\
\ \"acc_norm\": 0.9013157894736842,\n \"acc_norm_stderr\": 0.02427022773752271\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8226415094339623,\n \"acc_stderr\": 0.023508739218846948,\n\
\ \"acc_norm\": 0.8226415094339623,\n \"acc_norm_stderr\": 0.023508739218846948\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n\
\ \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n\
\ \"acc_norm_stderr\": 0.025545239210256917\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n\
\ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7872340425531915,\n \"acc_stderr\": 0.026754391348039776,\n\
\ \"acc_norm\": 0.7872340425531915,\n \"acc_norm_stderr\": 0.026754391348039776\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.5877192982456141,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.03416520447747548,\n\
\ \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.03416520447747548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6904761904761905,\n \"acc_stderr\": 0.023809523809523864,\n \"\
acc_norm\": 0.6904761904761905,\n \"acc_norm_stderr\": 0.023809523809523864\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.9032258064516129,\n \"acc_stderr\": 0.016818943416345197,\n \"\
acc_norm\": 0.9032258064516129,\n \"acc_norm_stderr\": 0.016818943416345197\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6502463054187192,\n \"acc_stderr\": 0.03355400904969566,\n \"\
acc_norm\": 0.6502463054187192,\n \"acc_norm_stderr\": 0.03355400904969566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\"\
: 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n\
\ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9141414141414141,\n \"acc_stderr\": 0.019960225563172885,\n \"\
acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.019960225563172885\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527036,\n\
\ \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8102564102564103,\n \"acc_stderr\": 0.019880165406588796,\n\
\ \"acc_norm\": 0.8102564102564103,\n \"acc_norm_stderr\": 0.019880165406588796\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4444444444444444,\n \"acc_stderr\": 0.030296771286067326,\n \
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.030296771286067326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n\
\ \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248437,\n \"\
acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248437\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9211009174311927,\n \"acc_stderr\": 0.011558198113769591,\n \"\
acc_norm\": 0.9211009174311927,\n \"acc_norm_stderr\": 0.011558198113769591\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6481481481481481,\n \"acc_stderr\": 0.03256850570293648,\n \"\
acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.03256850570293648\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"\
acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640266,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640266\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8295964125560538,\n\
\ \"acc_stderr\": 0.025234593447136182,\n \"acc_norm\": 0.8295964125560538,\n\
\ \"acc_norm_stderr\": 0.025234593447136182\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8931297709923665,\n \"acc_stderr\": 0.027096548624883733,\n\
\ \"acc_norm\": 0.8931297709923665,\n \"acc_norm_stderr\": 0.027096548624883733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540627,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540627\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n\
\ \"acc_stderr\": 0.02923927267563274,\n \"acc_norm\": 0.8981481481481481,\n\
\ \"acc_norm_stderr\": 0.02923927267563274\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8895705521472392,\n \"acc_stderr\": 0.024624937788941318,\n\
\ \"acc_norm\": 0.8895705521472392,\n \"acc_norm_stderr\": 0.024624937788941318\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n\
\ \"acc_stderr\": 0.046161430750285455,\n \"acc_norm\": 0.6160714285714286,\n\
\ \"acc_norm_stderr\": 0.046161430750285455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.03176683948640406,\n\
\ \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.03176683948640406\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n\
\ \"acc_stderr\": 0.016534627684311364,\n \"acc_norm\": 0.9316239316239316,\n\
\ \"acc_norm_stderr\": 0.016534627684311364\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.913154533844189,\n\
\ \"acc_stderr\": 0.010070298377747776,\n \"acc_norm\": 0.913154533844189,\n\
\ \"acc_norm_stderr\": 0.010070298377747776\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8265895953757225,\n \"acc_stderr\": 0.020383229551135022,\n\
\ \"acc_norm\": 0.8265895953757225,\n \"acc_norm_stderr\": 0.020383229551135022\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7318435754189944,\n\
\ \"acc_stderr\": 0.014816119635317005,\n \"acc_norm\": 0.7318435754189944,\n\
\ \"acc_norm_stderr\": 0.014816119635317005\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8464052287581699,\n \"acc_stderr\": 0.02064559791041877,\n\
\ \"acc_norm\": 0.8464052287581699,\n \"acc_norm_stderr\": 0.02064559791041877\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8231511254019293,\n\
\ \"acc_stderr\": 0.0216700588855108,\n \"acc_norm\": 0.8231511254019293,\n\
\ \"acc_norm_stderr\": 0.0216700588855108\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8765432098765432,\n \"acc_stderr\": 0.01830386880689179,\n\
\ \"acc_norm\": 0.8765432098765432,\n \"acc_norm_stderr\": 0.01830386880689179\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6205673758865248,\n \"acc_stderr\": 0.028947338851614098,\n \
\ \"acc_norm\": 0.6205673758865248,\n \"acc_norm_stderr\": 0.028947338851614098\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6160365058670143,\n\
\ \"acc_stderr\": 0.01242158783313423,\n \"acc_norm\": 0.6160365058670143,\n\
\ \"acc_norm_stderr\": 0.01242158783313423\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7977941176470589,\n \"acc_stderr\": 0.024398192986654924,\n\
\ \"acc_norm\": 0.7977941176470589,\n \"acc_norm_stderr\": 0.024398192986654924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8186274509803921,\n \"acc_stderr\": 0.015588643495370463,\n \
\ \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.015588643495370463\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8204081632653061,\n \"acc_stderr\": 0.024573293589585633,\n\
\ \"acc_norm\": 0.8204081632653061,\n \"acc_norm_stderr\": 0.024573293589585633\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.022076326101824664,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.022076326101824664\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594194,\n \
\ \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594194\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5903614457831325,\n\
\ \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.5903614457831325,\n\
\ \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.9005847953216374,\n \"acc_stderr\": 0.022949025579355024,\n\
\ \"acc_norm\": 0.9005847953216374,\n \"acc_norm_stderr\": 0.022949025579355024\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46266829865361075,\n\
\ \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.6176977126106841,\n\
\ \"mc2_stderr\": 0.014998426067966347\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598479\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7293404094010614,\n \
\ \"acc_stderr\": 0.012238245006183411\n }\n}\n```"
repo_url: https://huggingface.co/xDAN2099/xDAN-L2-moe-2x-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|arc:challenge|25_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|gsm8k|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hellaswag|10_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T11-58-02.756350.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T11-58-02.756350.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- '**/details_harness|winogrande|5_2024-01-16T11-58-02.756350.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T11-58-02.756350.parquet'
- config_name: results
data_files:
- split: 2024_01_16T11_58_02.756350
path:
- results_2024-01-16T11-58-02.756350.parquet
- split: latest
path:
- results_2024-01-16T11-58-02.756350.parquet
---
# Dataset Card for Evaluation run of xDAN2099/xDAN-L2-moe-2x-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [xDAN2099/xDAN-L2-moe-2x-v1](https://huggingface.co/xDAN2099/xDAN-L2-moe-2x-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xDAN2099__xDAN-L2-moe-2x-v1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-16T11:58:02.756350](https://huggingface.co/datasets/open-llm-leaderboard/details_xDAN2099__xDAN-L2-moe-2x-v1/blob/main/results_2024-01-16T11-58-02.756350.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the per-task configurations, in both the timestamped and the "latest" splits):
```python
{
"all": {
"acc": 0.7649759059339861,
"acc_stderr": 0.02802747077552357,
"acc_norm": 0.7678303278344503,
"acc_norm_stderr": 0.02857170363137812,
"mc1": 0.46266829865361075,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.6176977126106841,
"mc2_stderr": 0.014998426067966347
},
"harness|arc:challenge|25": {
"acc": 0.6638225255972696,
"acc_stderr": 0.013804855026205761,
"acc_norm": 0.6851535836177475,
"acc_norm_stderr": 0.01357265770308495
},
"harness|hellaswag|10": {
"acc": 0.6736705835490938,
"acc_stderr": 0.004679111783653905,
"acc_norm": 0.8630750846444931,
"acc_norm_stderr": 0.0034306550069275773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.039992628766177214,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.039992628766177214
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9013157894736842,
"acc_stderr": 0.02427022773752271,
"acc_norm": 0.9013157894736842,
"acc_norm_stderr": 0.02427022773752271
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8226415094339623,
"acc_stderr": 0.023508739218846948,
"acc_norm": 0.8226415094339623,
"acc_norm_stderr": 0.023508739218846948
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7872340425531915,
"acc_stderr": 0.026754391348039776,
"acc_norm": 0.7872340425531915,
"acc_norm_stderr": 0.026754391348039776
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.03416520447747548,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.03416520447747548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6904761904761905,
"acc_stderr": 0.023809523809523864,
"acc_norm": 0.6904761904761905,
"acc_norm_stderr": 0.023809523809523864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9032258064516129,
"acc_stderr": 0.016818943416345197,
"acc_norm": 0.9032258064516129,
"acc_norm_stderr": 0.016818943416345197
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6502463054187192,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.6502463054187192,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9141414141414141,
"acc_stderr": 0.019960225563172885,
"acc_norm": 0.9141414141414141,
"acc_norm_stderr": 0.019960225563172885
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527036,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8102564102564103,
"acc_stderr": 0.019880165406588796,
"acc_norm": 0.8102564102564103,
"acc_norm_stderr": 0.019880165406588796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.030296771286067326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.030296771286067326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398904,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9211009174311927,
"acc_stderr": 0.011558198113769591,
"acc_norm": 0.9211009174311927,
"acc_norm_stderr": 0.011558198113769591
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.03256850570293648,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.03256850570293648
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640266,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640266
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8295964125560538,
"acc_stderr": 0.025234593447136182,
"acc_norm": 0.8295964125560538,
"acc_norm_stderr": 0.025234593447136182
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8931297709923665,
"acc_stderr": 0.027096548624883733,
"acc_norm": 0.8931297709923665,
"acc_norm_stderr": 0.027096548624883733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540627,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540627
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563274,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563274
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8895705521472392,
"acc_stderr": 0.024624937788941318,
"acc_norm": 0.8895705521472392,
"acc_norm_stderr": 0.024624937788941318
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.046161430750285455,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.046161430750285455
},
"harness|hendrycksTest-management|5": {
"acc": 0.883495145631068,
"acc_stderr": 0.03176683948640406,
"acc_norm": 0.883495145631068,
"acc_norm_stderr": 0.03176683948640406
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.016534627684311364,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.016534627684311364
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.913154533844189,
"acc_stderr": 0.010070298377747776,
"acc_norm": 0.913154533844189,
"acc_norm_stderr": 0.010070298377747776
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8265895953757225,
"acc_stderr": 0.020383229551135022,
"acc_norm": 0.8265895953757225,
"acc_norm_stderr": 0.020383229551135022
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7318435754189944,
"acc_stderr": 0.014816119635317005,
"acc_norm": 0.7318435754189944,
"acc_norm_stderr": 0.014816119635317005
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8464052287581699,
"acc_stderr": 0.02064559791041877,
"acc_norm": 0.8464052287581699,
"acc_norm_stderr": 0.02064559791041877
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8231511254019293,
"acc_stderr": 0.0216700588855108,
"acc_norm": 0.8231511254019293,
"acc_norm_stderr": 0.0216700588855108
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8765432098765432,
"acc_stderr": 0.01830386880689179,
"acc_norm": 0.8765432098765432,
"acc_norm_stderr": 0.01830386880689179
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6205673758865248,
"acc_stderr": 0.028947338851614098,
"acc_norm": 0.6205673758865248,
"acc_norm_stderr": 0.028947338851614098
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6160365058670143,
"acc_stderr": 0.01242158783313423,
"acc_norm": 0.6160365058670143,
"acc_norm_stderr": 0.01242158783313423
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7977941176470589,
"acc_stderr": 0.024398192986654924,
"acc_norm": 0.7977941176470589,
"acc_norm_stderr": 0.024398192986654924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.015588643495370463,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.015588643495370463
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8204081632653061,
"acc_stderr": 0.024573293589585633,
"acc_norm": 0.8204081632653061,
"acc_norm_stderr": 0.024573293589585633
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824664,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824664
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.023868325657594194,
"acc_norm": 0.94,
"acc_norm_stderr": 0.023868325657594194
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5903614457831325,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.5903614457831325,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9005847953216374,
"acc_stderr": 0.022949025579355024,
"acc_norm": 0.9005847953216374,
"acc_norm_stderr": 0.022949025579355024
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46266829865361075,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.6176977126106841,
"mc2_stderr": 0.014998426067966347
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598479
},
"harness|gsm8k|5": {
"acc": 0.7293404094010614,
"acc_stderr": 0.012238245006183411
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Muthuchancoach/Trichy_AI | ---
license: creativeml-openrail-m
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 39367
num_examples: 177
download_size: 7655
dataset_size: 39367
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DataStudio/OCR_document_redSeal | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 939108297.25
num_examples: 223830
download_size: 863369893
dataset_size: 939108297.25
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: other
task_categories:
- image-to-text
language:
- vi
pretty_name: OCR red seal document
size_categories:
- 100K<n<1M
---
# Dataset Card for "OCR_document_redSeal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Othmanotana/darija | ---
license: unknown
---
|
YBXL/GI_Reasoning_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 4942725
num_examples: 1462
- name: valid
num_bytes: 4942725
num_examples: 1462
- name: test
num_bytes: 4942725
num_examples: 1462
download_size: 7369257
dataset_size: 14828175
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
Reindrob/civ | ---
license: unknown
---
|
open-llm-leaderboard/details_sbawa__elysa_model | ---
pretty_name: Evaluation run of sbawa/elysa_model
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sbawa/elysa_model](https://huggingface.co/sbawa/elysa_model) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sbawa__elysa_model\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T19:58:53.953436](https://huggingface.co/datasets/open-llm-leaderboard/details_sbawa__elysa_model/blob/main/results_2024-03-29T19-58-53.953436.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.262045867176558,\n\
\ \"acc_stderr\": 0.030953265760798498,\n \"acc_norm\": 0.26370242732678956,\n\
\ \"acc_norm_stderr\": 0.031728717538057886,\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757449,\n \"mc2\": 0.3736544015528367,\n\
\ \"mc2_stderr\": 0.013842660843141093\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.34044368600682595,\n \"acc_stderr\": 0.01384746051889298,\n\
\ \"acc_norm\": 0.37542662116040953,\n \"acc_norm_stderr\": 0.014150631435111728\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4536944831706831,\n\
\ \"acc_stderr\": 0.00496833714413636,\n \"acc_norm\": 0.6036646086436964,\n\
\ \"acc_norm_stderr\": 0.004881359589149009\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.0335567721631314,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.0335567721631314\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.16447368421052633,\n \"acc_stderr\": 0.030167533468632726,\n\
\ \"acc_norm\": 0.16447368421052633,\n \"acc_norm_stderr\": 0.030167533468632726\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n\
\ \"acc_stderr\": 0.033687629322594295,\n \"acc_norm\": 0.2658959537572254,\n\
\ \"acc_norm_stderr\": 0.033687629322594295\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231008,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231008\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131183,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131183\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525218,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525218\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.03512207412302054,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.03512207412302054\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24193548387096775,\n\
\ \"acc_stderr\": 0.024362599693031096,\n \"acc_norm\": 0.24193548387096775,\n\
\ \"acc_norm_stderr\": 0.024362599693031096\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2561576354679803,\n \"acc_stderr\": 0.0307127300709826,\n\
\ \"acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.0307127300709826\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.0340150671524904,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.0340150671524904\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.22727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"\
acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.030975436386845426,\n\
\ \"acc_norm\": 0.24352331606217617,\n \"acc_norm_stderr\": 0.030975436386845426\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24358974358974358,\n \"acc_stderr\": 0.02176373368417393,\n\
\ \"acc_norm\": 0.24358974358974358,\n \"acc_norm_stderr\": 0.02176373368417393\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2773109243697479,\n \"acc_stderr\": 0.02907937453948001,\n \
\ \"acc_norm\": 0.2773109243697479,\n \"acc_norm_stderr\": 0.02907937453948001\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.034791855725996586,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.034791855725996586\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23853211009174313,\n \"acc_stderr\": 0.01827257581023187,\n \"\
acc_norm\": 0.23853211009174313,\n \"acc_norm_stderr\": 0.01827257581023187\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n\
\ \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.3632286995515695,\n\
\ \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3055555555555556,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.3055555555555556,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2085889570552147,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.2085889570552147,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260597,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260597\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.028911208802749482,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.028911208802749482\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2796934865900383,\n\
\ \"acc_stderr\": 0.016050792148036546,\n \"acc_norm\": 0.2796934865900383,\n\
\ \"acc_norm_stderr\": 0.016050792148036546\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331161,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331161\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.20212765957446807,\n \"acc_stderr\": 0.02395666823785022,\n \
\ \"acc_norm\": 0.20212765957446807,\n \"acc_norm_stderr\": 0.02395666823785022\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.27941176470588236,\n \"acc_stderr\": 0.02725720260611495,\n\
\ \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.02725720260611495\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320657,\n \
\ \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.33636363636363636,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17142857142857143,\n \"acc_stderr\": 0.024127463462650135,\n\
\ \"acc_norm\": 0.17142857142857143,\n \"acc_norm_stderr\": 0.024127463462650135\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\
\ \"acc_stderr\": 0.036108050180310235,\n \"acc_norm\": 0.3132530120481928,\n\
\ \"acc_norm_stderr\": 0.036108050180310235\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2046783625730994,\n \"acc_stderr\": 0.03094445977853319,\n\
\ \"acc_norm\": 0.2046783625730994,\n \"acc_norm_stderr\": 0.03094445977853319\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757449,\n \"mc2\": 0.3736544015528367,\n\
\ \"mc2_stderr\": 0.013842660843141093\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6022099447513812,\n \"acc_stderr\": 0.013755743513749027\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \
\ \"acc_stderr\": 0.0026153265107756716\n }\n}\n```"
repo_url: https://huggingface.co/sbawa/elysa_model
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|arc:challenge|25_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|gsm8k|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hellaswag|10_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T19-58-53.953436.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T19-58-53.953436.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- '**/details_harness|winogrande|5_2024-03-29T19-58-53.953436.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T19-58-53.953436.parquet'
- config_name: results
data_files:
- split: 2024_03_29T19_58_53.953436
path:
- results_2024-03-29T19-58-53.953436.parquet
- split: latest
path:
- results_2024-03-29T19-58-53.953436.parquet
---
# Dataset Card for Evaluation run of sbawa/elysa_model
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sbawa/elysa_model](https://huggingface.co/sbawa/elysa_model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sbawa__elysa_model",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-29T19:58:53.953436](https://huggingface.co/datasets/open-llm-leaderboard/details_sbawa__elysa_model/blob/main/results_2024-03-29T19-58-53.953436.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.262045867176558,
"acc_stderr": 0.030953265760798498,
"acc_norm": 0.26370242732678956,
"acc_norm_stderr": 0.031728717538057886,
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757449,
"mc2": 0.3736544015528367,
"mc2_stderr": 0.013842660843141093
},
"harness|arc:challenge|25": {
"acc": 0.34044368600682595,
"acc_stderr": 0.01384746051889298,
"acc_norm": 0.37542662116040953,
"acc_norm_stderr": 0.014150631435111728
},
"harness|hellaswag|10": {
"acc": 0.4536944831706831,
"acc_stderr": 0.00496833714413636,
"acc_norm": 0.6036646086436964,
"acc_norm_stderr": 0.004881359589149009
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.0335567721631314,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.0335567721631314
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.16447368421052633,
"acc_stderr": 0.030167533468632726,
"acc_norm": 0.16447368421052633,
"acc_norm_stderr": 0.030167533468632726
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.033687629322594295,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.033687629322594295
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231008,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231008
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436695,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436695
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131183,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131183
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525218,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302054,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302054
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24193548387096775,
"acc_stderr": 0.024362599693031096,
"acc_norm": 0.24193548387096775,
"acc_norm_stderr": 0.024362599693031096
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.0307127300709826,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.0307127300709826
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.0340150671524904,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.0340150671524904
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.02985751567338641,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.02985751567338641
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24352331606217617,
"acc_stderr": 0.030975436386845426,
"acc_norm": 0.24352331606217617,
"acc_norm_stderr": 0.030975436386845426
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.02176373368417393,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.02176373368417393
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2773109243697479,
"acc_stderr": 0.02907937453948001,
"acc_norm": 0.2773109243697479,
"acc_norm_stderr": 0.02907937453948001
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.034791855725996586,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.034791855725996586
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23853211009174313,
"acc_stderr": 0.01827257581023187,
"acc_norm": 0.23853211009174313,
"acc_norm_stderr": 0.01827257581023187
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3632286995515695,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.3632286995515695,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2085889570552147,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.2085889570552147,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260597,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260597
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749482,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749482
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2796934865900383,
"acc_stderr": 0.016050792148036546,
"acc_norm": 0.2796934865900383,
"acc_norm_stderr": 0.016050792148036546
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331161,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331161
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.20212765957446807,
"acc_stderr": 0.02395666823785022,
"acc_norm": 0.20212765957446807,
"acc_norm_stderr": 0.02395666823785022
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.02725720260611495,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.02725720260611495
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17142857142857143,
"acc_stderr": 0.024127463462650135,
"acc_norm": 0.17142857142857143,
"acc_norm_stderr": 0.024127463462650135
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916718,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.036108050180310235,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.036108050180310235
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2046783625730994,
"acc_stderr": 0.03094445977853319,
"acc_norm": 0.2046783625730994,
"acc_norm_stderr": 0.03094445977853319
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757449,
"mc2": 0.3736544015528367,
"mc2_stderr": 0.013842660843141093
},
"harness|winogrande|5": {
"acc": 0.6022099447513812,
"acc_stderr": 0.013755743513749027
},
"harness|gsm8k|5": {
"acc": 0.009097801364670205,
"acc_stderr": 0.0026153265107756716
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
P1ayer-1/tiny_stories_packed | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 2146599252.0
num_examples: 1046101
download_size: 894178226
dataset_size: 2146599252.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "tiny_stories_packed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hsultanbey/code-dataset | ---
dataset_info:
features:
- name: language
dtype: string
- name: func_code_string
dtype: string
splits:
- name: train
num_bytes: 764364445.0859526
num_examples: 857962
- name: test
num_bytes: 7721491.914047418
num_examples: 8667
- name: valid
num_bytes: 35647063.0
num_examples: 38435
download_size: 364838286
dataset_size: 807733000.0
---
# Dataset Card for "code-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
approach0/no-asy-precalculus-topics-by-queryLM | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: src_path
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: out_str
dtype: string
- name: tool_res
sequence: string
splits:
- name: test
num_bytes: 2199344
num_examples: 392
download_size: 675379
dataset_size: 2199344
---
# Dataset Card for "no-asy-precalculus-topics"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/story_1_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 3199
num_examples: 10
download_size: 4429
dataset_size: 3199
---
# Dataset Card for "story_1_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NarchAI1992/lora_townhouse | ---
license: openrail
---
|
autoevaluate/autoeval-staging-eval-project-glue-f6cacc01-14075929 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: binary_classification
model: mrm8488/deberta-v3-small-finetuned-sst2
metrics: []
dataset_name: glue
dataset_config: sst2
dataset_split: validation
col_mapping:
text: sentence
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Text Classification
* Model: mrm8488/deberta-v3-small-finetuned-sst2
* Dataset: glue
* Config: sst2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
andreaponti/NDC-sectors | ---
task_categories:
- text-classification
language:
- en
- es
tags:
- climate
pretty_name: NDC Sector Classification
size_categories:
- n<1K
configs:
- config_name: default
data_files:
- split: train
path: "NDC_sectors.csv"
- config_name: sector_description
data_files: "sectors.json"
---
# NDC Sector Classification
This dataset is built from the tagged NDC ([Climate Watch](https://www.climatewatchdata.org/data-explorer/historical-emissions?historical-emissions-data-sources=climate-watch&historical-emissions-gases=all-ghg&historical-emissions-regions=All%20Selected&historical-emissions-sectors=total-including-lucf%2Ctotal-including-lucf&page=1)) paragraphs made by [GIZ Data Service Center](https://www.giz.de/expertise/html/63018.html) and available on Hugging Face ([GIZ/policy_qa_v0](https://huggingface.co/datasets/GIZ/policy_qa_v0)).
The NDC URLs have been taken from the [IGES NDC Database](https://www.iges.or.jp/en/pub/iges-indc-ndc-database/en).
Each NDC has been classified under a specific sector if it contains at least one paragraph tagged with that sector; each NDC can therefore be associated with multiple sectors.
The dataset contains 250 documents classified into 18 sectors. The following plot shows the number of documents tagged with each sector.

## NDC Data
The csv containing the tagged NDC is structured as follows:
- `Country`: The country to which the NDC refers.
- `Document`: The type of document (INDC, First NDC, Second NDC).
- `Language`: The original language of the NDC.
- `Sector`: A JSON object whose keys are the sectors mentioned in the NDC and whose values are the number of paragraphs mentioning each sector.
- `URL`: The PDF URL.
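As a small illustration of the `Sector` column described above, a single cell can be parsed with the standard `json` module (the sector names and counts below are hypothetical sample values, not taken from the dataset):

```python
import json

# Hypothetical value of the `Sector` column for one NDC row:
# keys are sectors, values are paragraph counts, as described above.
sector_cell = '{"Energy": 12, "Agriculture": 5, "Transport": 3}'

sector_counts = json.loads(sector_cell)

# An NDC is tagged with every sector that has at least one paragraph.
tagged_sectors = sorted(s for s, n in sector_counts.items() if n >= 1)
print(tagged_sectors)  # ['Agriculture', 'Energy', 'Transport']
```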
## Sector Data
The json containing the sectors' description follows the scheme below:
```json
{
"topic_list_id":"UUID",
"topics":[
{
"topic_id":"integer",
"topic_name":"string",
"definitions":[
{
"lang":"string",
"description":"string"
}
]
}
]
}
```
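A minimal sketch of reading a document that follows this scheme; the topic name and descriptions below are invented for illustration and do not come from `sectors.json`:

```python
import json

# A small document following the scheme above (illustrative values only).
sectors_json = """
{
  "topic_list_id": "00000000-0000-0000-0000-000000000000",
  "topics": [
    {
      "topic_id": 1,
      "topic_name": "Energy",
      "definitions": [
        {"lang": "en", "description": "Energy production and use."},
        {"lang": "es", "description": "Produccion y uso de energia."}
      ]
    }
  ]
}
"""

data = json.loads(sectors_json)

# Build a {topic_name: English description} lookup from the definitions.
en_defs = {
    t["topic_name"]: next(
        d["description"] for d in t["definitions"] if d["lang"] == "en"
    )
    for t in data["topics"]
}
print(en_defs["Energy"])  # Energy production and use.
```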
**Note:** The descriptions have been taken from Wikipedia (en). The Spanish version is a translation of the english one. |
airaspberry/hoodie-cad | ---
license: openrail
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_109 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1285093856.0
num_examples: 250408
download_size: 1316324622
dataset_size: 1285093856.0
---
# Dataset Card for "chunk_109"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
miriad/miriad-v0-6M | ---
dataset_info:
features:
- name: qa_id
dtype: string
- name: paper_id
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: paper_url
dtype: string
- name: paper_title
dtype: string
- name: passage_text
dtype: string
- name: passage_position
dtype: string
- name: year
dtype: int64
splits:
- name: train
num_bytes: 34415615449
num_examples: 6430601
download_size: 8205967742
dataset_size: 34415615449
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
JosephFeig/Assignment2A | ---
dataset_info:
features:
- name: key
dtype: string
- name: fare_amount
dtype: float64
- name: pickup_datetime
dtype: string
- name: pickup_longitude
dtype: float64
- name: pickup_latitude
dtype: float64
- name: dropoff_longitude
dtype: float64
- name: dropoff_latitude
dtype: float64
- name: passenger_count
dtype: int64
splits:
- name: train
num_bytes: 10667707
num_examples: 100000
download_size: 6806842
dataset_size: 10667707
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
billfass/ALFFA_PUBLIC | ---
license: afl-3.0
---
|
AshanGimhana/Testingdata | ---
license: mit
---
|
Wanfq/Explore_Instruct_Rewriting_10k | ---
license: cc-by-nc-4.0
language:
- en
---
<p align="center" width="100%">
</p>
<div id="top" align="center">
**Explore-Instruct: Enhancing Domain-Specific Instruction Coverage through Active Exploration**
<h4> |<a href="https://arxiv.org/abs/2310.09168"> 📑 Paper </a> |
<a href="https://huggingface.co/datasets?sort=trending&search=Explore_Instruct"> 🤗 Data </a> |
<a href="https://huggingface.co/models?sort=trending&search=Explore-LM"> 🤗 Model </a> |
<a href="https://github.com/fanqiwan/Explore-Instruct"> 🐱 Github Repo </a> |
</h4>
<!-- **Authors:** -->
_**Fanqi Wan<sup>†</sup>, Xinting Huang<sup>‡</sup>, Tao Yang<sup>†</sup>, Xiaojun Quan<sup>†</sup>, Wei Bi<sup>‡</sup>, Shuming Shi<sup>‡</sup>**_
<!-- **Affiliations:** -->
_<sup>†</sup> Sun Yat-sen University,
<sup>‡</sup> Tencent AI Lab_
</div>
## News
- **Oct 16, 2023:** 🔥 We're excited to announce that the Explore-Instruct datasets in brainstorming, rewriting, and math domains are now available on 🤗 [Huggingface Datasets](https://huggingface.co/datasets?sort=trending&search=Explore_Instruct)! Additionally, we've released Explore-LM models that have been initialized with LLaMA-7B and fine-tuned with the Explore-Instruct data in each domain. You can find these models on 🤗 [Huggingface Models](https://huggingface.co/models?sort=trending&search=Explore-LM). Happy exploring and instructing!
## Contents
- [Overview](#overview)
- [Data Release](#data-release)
- [Model Release](#model-release)
- [Data Generation Process](#data-generation-process)
- [Fine-tuning](#fine-tuning)
- [Evaluation](#evaluation)
- [Limitations](#limitations)
- [License](#license)
- [Citation](#citation)
- [Acknowledgements](#acknowledgments)
## Overview
We propose Explore-Instruct, a novel approach to enhancing domain-specific instruction coverage. We posit that the domain space is inherently structured akin to a tree, reminiscent of cognitive science ontologies. Drawing from the essence of classical search algorithms and incorporating the power of LLMs, Explore-Instruct is conceived to actively traverse the domain space and generate instruction-tuning data, **not** necessitating a predefined tree structure. Specifically, Explore-Instruct employs two strategic operations: lookahead and backtracking exploration:
- **Lookahead** delves into a multitude of potential fine-grained sub-tasks, thereby mapping out a complex network of tasks
- **Backtracking** seeks alternative branches to widen the search boundary, hence extending the domain spectrum.
<p align="center">
<img src="https://github.com/fanqiwan/Explore-Instruct/blob/main/assets/fig2.png?raw=true" width="95%"> <br>
</p>
## Data Release
We release the Explore-Instruct data in brainstorming, rewriting, and math domains on 🤗 [Huggingface Datasets](https://huggingface.co/datasets?sort=trending&search=Explore_Instruct). Each domain includes two versions of the dataset: a basic version and an extended version. The basic version contains 10k instruction-tuning examples, and the extended versions contain 16k, 32k, and 64k examples for the three domains respectively. Each dataset is a structured data file in JSON format, consisting of a list of dictionaries with the following fields:
- `instruction`: `str`, describes the task the model should perform.
- `input`: `str`, optional context or input for the task.
- `output`: `str`, ground-truth output text for the task and input text.
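Since each record carries the `instruction`/`input`/`output` fields listed above, one record can be flattened into a single training string roughly as follows (this is only a sketch; the exact prompt template used for fine-tuning, e.g. the Alpaca template selected by `--prompt_type alpaca`, may differ):

```python
def build_prompt(example: dict) -> str:
    """Flatten one instruction-tuning record into a single string (sketch)."""
    if example.get("input"):
        return (
            f"Instruction: {example['instruction']}\n"
            f"Input: {example['input']}\n"
            f"Response: {example['output']}"
        )
    return (
        f"Instruction: {example['instruction']}\n"
        f"Response: {example['output']}"
    )

# Hypothetical record in the rewriting domain.
example = {
    "instruction": "Rewrite the sentence in a formal tone.",
    "input": "gonna be late, sorry",
    "output": "I apologize; I am going to be late.",
}
print(build_prompt(example))
```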
The results of data-centric analysis are shown as follows:
<p align="left">
<img src="https://github.com/fanqiwan/Explore-Instruct/blob/main/assets/fig1.png?raw=true" width="50%"> <br>
</p>
| Method | Brainstorming Unique<br/>V-N pairs | Rewriting Unique<br/>V-N pairs | Math Unique<br/>V-N pairs |
|:--------------------------------|:----------------------------------:|:------------------------------:|:-------------------------:|
| _Domain-Specific Human-Curated_ | 2 | 8 | 3 |
| _Domain-Aware Self-Instruct_ | 781 | 1715 | 451 |
| Explore-Instruct | **790** | **2015** | **917** |
## Model Release
We release the Explore-LM models in brainstorming, rewriting, and math domains on 🤗 [Huggingface Models](https://huggingface.co/models?sort=trending&search=Explore-LM). Each domain includes two versions of the model: a basic version and an extended version, each trained with the corresponding version of the dataset.
The results of automatic and human evaluation in three domains are shown as follows:
- Automatic evaluation:
| Automatic Comparison in the Brainstorming Domain | Win:Tie:Lose | Beat Rate |
|:-------------------------------------------------|:------------:|:---------:|
| Explore-LM vs Domain-Curated-LM | 194:1:13 | 93.72 |
| Explore-LM-Ext vs Domain-Curated-LM | 196:1:11 | 94.69 |
| Explore-LM vs Domain-Instruct-LM | 114:56:38 | 75.00 |
| Explore-LM-Ext vs Domain-Instruct-LM | 122:55:31 | 79.74 |
| Explore-LM vs ChatGPT | 52:71:85 | 37.96 |
| Explore-LM-Ext vs ChatGPT | 83:69:56 | 59.71 |
| Automatic Comparison in the Rewriting Domain | Win:Tie:Lose | Beat Rate |
|:---------------------------------------------|:------------:|:---------:|
| Explore-LM vs Domain-Curated-LM | 50:38:6 | 89.29 |
| Explore-LM-Ext vs Domain-Curated-LM | 53:37:4 | 92.98 |
| Explore-LM vs Domain-Instruct-LM | 34:49:11 | 75.56 |
| Explore-LM-Ext vs Domain-Instruct-LM | 35:53:6 | 85.37 |
| Explore-LM vs ChatGPT | 11:59:24 | 31.43 |
| Explore-LM-Ext vs ChatGPT | 12:56:26 | 31.58 |
| Automatic Comparison in the Math Domain | Accuracy Rate |
|:----------------------------------------|:-------------:|
| Domain-Curated-LM | 3.4 |
| Domain-Instruct-LM | 4.0 |
| Explore-LM | 6.8 |
| Explore-LM-Ext | 8.4 |
| ChatGPT | 34.8 |
- Human evaluation:
<p align="left">
<img src="https://github.com/fanqiwan/Explore-Instruct/blob/main/assets/fig5.png?raw=true" width="95%"> <br>
</p>
## Data Generation Process
To generate the domain-specific instruction-tuning data, please follow the following commands step by step:
### Domain Space Exploration
```
python3 generate_instruction.py \
--action extend \
--save_dir ./en_data/demo_domain \ # input dir include current domain tree for exploration
--out_dir ./en_data/demo_domain_exploration \ # output dir of the explored new domain tree
--lang <LANGUAGE> \ # currently support 'en'
--domain demo_domain \ # domain for exploration
--extend_nums <TASK_NUMBER_DEPTH_0>,...,<TASK_NUMBER_DEPTH_MAX_DEPTH-1> \ # exploration breadth at each depth
--max_depth <MAX_DEPTH> \ # exploration depth
--assistant_name <ASSISTANT_NAME> # currently support openai and claude
```
### Instruction-Tuning Data Generation
```
python3 generate_instruction.py \
--action enrich \
--save_dir ./en_data/demo_domain_exploration \ # input dir include current domain tree for data generation
--out_dir ./en_data/demo_domain_generation \ # output dir of the domain tree with generated data
--lang <LANGUAGE> \ # currently support 'en'
--domain demo_domain \ # domain for exploration
--enrich_nums <DATA_NUMBER_DEPTH_0>,...,<DATA_NUMBER_DEPTH_MAX_DEPTH> \ # data number for task at each depth
--enrich_batch_size <BATCH_SIZE> \ # batch size for data generation
--assistant_name <ASSISTANT_NAME> # currently support openai and claude
```
### Task Pruning
```
python3 generate_instruction.py \
--action prune \
--save_dir ./en_data/demo_domain_generation \ # input dir include current domain tree for task pruning
--out_dir ./en_data/demo_domain_pruning \ # output dir of the domain tree with 'pruned_subtasks_name.json' file
--lang <LANGUAGE> \ # currently support 'en'
--domain demo_domain \ # domain for exploration
--pruned_file ./en_data/demo_domain_pruning/pruned_subtasks_name.json \ # file of pruned tasks
--prune_threshold <PRUNE_THRESHOLD> \ # threshold of rouge-l overlap between task names
--assistant_name <ASSISTANT_NAME> # currently support openai and claude
```
### Data Filtering
```
python3 generate_instruction.py \
--action filter \
--save_dir ./en_data/demo_domain_pruning \ # input dir include current domain tree for data filtering
--out_dir ./en_data/demo_domain_filtering \ # output dir of the domain tree with fitered data
--lang <LANGUAGE> \ # currently support 'en'
--domain demo_domain \ # domain for exploration
--pruned_file ./en_data/demo_domain_pruning/pruned_subtasks_name.json \ # file of pruned tasks
--filter_threshold <FILTER_THRESHOLD> \ # threshold of rouge-l overlap between instructions
--assistant_name <ASSISTANT_NAME> # currently support openai and claude
```
### Data Sampling
```
python3 generate_instruction.py \
--action sample \
--save_dir ./en_data/demo_domain_filtering \ # input dir include current domain tree for data sampling
--out_dir ./en_data/demo_domain_sampling \ # output dir of the domain tree with sampled data
--lang <LANGUAGE> \ # currently support 'en'
--domain demo_domain \ # domain for exploration
--pruned_file ./en_data/demo_domain_filtering/pruned_subtasks_name.json \ # file of pruned tasks
--sample_example_num <SAMPLE_EXAMPLES_NUM> \ # number of sampled examples
--sample_max_depth <SAMPLE_MAX_DEPTH> \ # max depth for data sampling
--sample_use_pruned \ # do not sample from pruned tasks
--assistant_name <ASSISTANT_NAME> # currently support openai and claude
```
## Fine-tuning
We fine-tune LLaMA-7B with the following hyperparameters:
| Hyperparameter | Global Batch Size | Learning rate | Epochs | Max length | Weight decay |
|:----------------|-------------------:|---------------:|--------:|------------:|--------------:|
| LLaMA 7B | 128 | 2e-5 | 3 | 512| 0 |
To reproduce the training procedure, please use the following command:
```
deepspeed --num_gpus=8 ./train/train.py \
--deepspeed ./deepspeed_config/deepspeed_zero3_offload_config.json \
--model_name_or_path decapoda-research/llama-7b-hf \
--data_path ./en_data/demo_domain_sampling \
--fp16 True \
--output_dir ./training_results/explore-lm-7b-demo-domain \
--num_train_epochs 3 \
--per_device_train_batch_size 2 \
--per_device_eval_batch_size 2 \
--gradient_accumulation_steps 8 \
--evaluation_strategy "no" \
--model_max_length 512 \
--save_strategy "steps" \
--save_steps 2000 \
--save_total_limit 1 \
--learning_rate 2e-5 \
--weight_decay 0. \
--warmup_ratio 0.03 \
--lr_scheduler_type "cosine" \
--logging_steps 1 \
--prompt_type alpaca \
2>&1 | tee ./training_logs/explore-lm-7b-demo-domain.log
python3 ./train/zero_to_fp32.py \
--checkpoint_dir ./training_results/explore-lm-7b-demo-domain \
--output_file ./training_results/explore-lm-7b-demo-domain/pytorch_model.bin
```
## Evaluation
The evaluation datasets for different domains are as follows:
- Brainstorming and Rewriting: From the corresponding categories in the translated test set of BELLE. ([en_eval_set.jsonl](./eval/question/en_eval_set.jsonl))
- Math: From 500 questions randomly selected from the test set of MATH. ([MATH_eval_set_sample.jsonl](./eval/question/MATH_eval_set_sample.jsonl))
The evaluation metrics for different domains are as follows:
- Brainstorming and Rewriting: Both automatic and human evaluations following Vicuna.
- Math: Accuracy Rate metric in solving math problems.
The automatic evaluation commands for different domains are as follows:
```
# Brainstorming and Rewriting Domain
# 1. Inference
python3 ./eval/generate.py \
--model_id <MODEL_ID> \
--model_path <MODEL_PATH> \
--question_file ./eval/question/en_eval_set.jsonl \
--answer_file ./eval/answer/<MODEL_ID>.jsonl \
--num_gpus 8 \
--num_beams 1 \
--temperature 0.7 \
--max_new_tokens 512 \
--prompt_type alpaca \
--do_sample
# 2. Evaluation
python3 ./eval/chatgpt_score.py \
--baseline_file ./eval/answer/<MODEL_1>.jsonl \ # answer of baseline model to compare with
--answer_file ./eval/answer/<MODEL_2>.jsonl \ # answer of evaluation model
--review_file ./eval/review/<MODEL_1>_cp_<MODEL_2>_<DOMAIN>.jsonl \ # review from chatgpt
--prompt_file ./eval/prompt/en_review_prompt_compare.jsonl \ # evaluation prompt for chatgpt
--target_classes <DOMAIN> \ # evaluation domain
--batch_size <BATCH_SIZE> \
--review_model "gpt-3.5-turbo-0301"
```
```
# Math Domain
# 1. Inference
python3 ./eval/generate.py \
--model_id <MODEL_ID> \
--model_path <MODEL_PATH> \
--question_file ./eval/question/MATH_eval_set_sample.jsonl \
--answer_file ./eval/answer/<MODEL_ID>.jsonl \
--num_gpus 8 \
--num_beams 10 \
--temperature 1.0 \
--max_new_tokens 512 \
--prompt_type alpaca
# 2. Evaluation
python3 ./eval/auto_eval.py \
--question_file ./eval/question/MATH_eval_set_sample.jsonl \
--answer_file ./eval/answer/<MODEL_ID>.jsonl # answer of evaluation model
```
## Limitations
Explore-Instruct is still under development and needs a lot of improvements. We acknowledge that our work focuses on the enhancement of domain-specific instruction coverage and does not address other aspects of instruction-tuning, such as the generation of complex and challenging instructions or the mitigation of toxic and harmful instructions. Future work is needed to explore the potential of our approach in these areas.
## License
Explore-Instruct is intended and licensed for research use only. The dataset is CC BY NC 4.0 (allowing only non-commercial use) and models trained using the dataset should not be used outside of research purposes. The weights of Explore-LM models are also CC BY NC 4.0 (allowing only non-commercial use).
## Citation
If you find this work is relevant with your research or applications, please feel free to cite our work!
```
@misc{wan2023explore,
title={Explore-Instruct: Enhancing Domain-Specific Instruction Coverage through Active Exploration},
author={Fanqi, Wan and Xinting, Huang and Tao, Yang and Xiaojun, Quan and Wei, Bi and Shuming, Shi},
year={2023},
eprint={2310.09168},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## Acknowledgments
This repo benefits from [Stanford-Alpaca](https://github.com/tatsu-lab/stanford_alpaca) and [Vicuna](https://github.com/lm-sys/FastChat). Thanks for their wonderful works!
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-74000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1020572
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-22d4f209-4087-42ac-a9a4-6d47e201055d-6458 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-base-16384-book-summary
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-base-16384-book-summary
* Dataset: samsum
* Config: samsum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
QiyaoWei/Reproducing-DPO | ---
license: apache-2.0
---
|
ai-forever/spellcheck_punctuation_benchmark | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- ru
license: mit
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
task_categories:
- text-generation
pretty_name: Russian Spellcheck Punctuation Benchmark
language_bcp47:
- ru-RU
tags:
- spellcheck
- russian
---
# Dataset Card for Russian Spellcheck Punctuation Benchmark
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [SAGE](https://github.com/ai-forever/sage)
- **Paper:** [EACL 2024 paper](https://aclanthology.org/2024.findings-eacl.10/)
- **Point of Contact:** nikita.martynov.98@list.ru
### Dataset Summary
The collection is an updated version of [Russian Spellcheck Benchmark](https://huggingface.co/datasets/ai-forever/spellcheck_benchmark) with punctuation corrected.
The Benchmark includes four datasets, each of which consists of pairs of sentences in Russian language.
Each pair embodies sentence, which may contain spelling and punctuation errors, and its corresponding correction.
Datasets were gathered from various sources and domains including social networks, internet blogs, github commits, medical anamnesis, literature, news, reviews and more.
All datasets were passed through two-stage manual labeling pipeline.
The correction of a sentence is defined by an agreement of at least two human annotators.
Manual labeling scheme accounts for jargonisms, collocations and common language, hence in some cases it encourages
annotators not to amend a word in favor of preserving style of a text.
The latter does not apply to punctuation. Punctuation signs are rigorously marked in accordance with the rules of the Russian punctuation system.
### Supported Tasks and Leaderboards
- **Task:** automatic spelling correction.
- **Metrics:** https://www.dialog-21.ru/media/3427/sorokinaaetal.pdf.
- **ERRANT:** https://github.com/chrisjbryant/errant.
### Languages
Russian.
## Dataset Structure
### Data Instances
#### RUSpellRU
- **Size of downloaded dataset files:** 3.65 Mb
- **Size of the generated dataset:** 1.31 Mb
- **Total amount of disk used:** 4.96 Mb
An example of "train" / "test" looks as follows
```
{
"source": "давольно милый и летом и зимой обогреваемый теплым солнушком",
"correction": "Довольно милый, и летом, и зимой обогреваемый тёплым солнышком.",
}
```
#### MultidomainGold
- **Size of downloaded dataset files:** 15.03 Mb
- **Size of the generated dataset:** 5.43 Mb
- **Total amount of disk used:** 20.46 Mb
An example of "test" looks as follows
```
{
"source": "для меня всё материальное тленно и лишь находясь в гармонии-для начала с собой-можно радовацца чужому счастью искренне",
"correction": "Для меня всё материальное тленно, и лишь находясь в гармонии - для начала с собой - можно радоваться чужому счастью искренне.",
"domain": "web",
}
```
#### MedSpellcheck
- **Size of downloaded dataset files:** 1.49 Mb
- **Size of the generated dataset:** 0.54 Mb
- **Total amount of disk used:** 2.03 Mb
An example of "test" looks as follows
```
{
"source": "Накануне (18.02.2012 г",
"correction": "Накануне (18.02.2012 г.).",
}
```
#### GitHubTypoCorpusRu
- **Size of downloaded dataset files:** 1.23 Mb
- **Size of the generated dataset:** 0.48 Mb
- **Total amount of disk used:** 1.71 Mb
An example of "test" looks as follows
```
{
"source": "text: Пожалуйста выберите чат, чтобы начать общение",
"correction": "text: Пожалуйста, выберите чат, чтобы начать общение.",
}
```
### Data Fields
#### RUSpellRU
- `source`: a `string` feature
- `correction`: a `string` feature
- `domain`: a `string` feature
#### MultidomainGold
- `source`: a `string` feature
- `correction`: a `string` feature
- `domain`: a `string` feature
#### MedSpellcheck
- `source`: a `string` feature
- `correction`: a `string` feature
- `domain`: a `string` feature
#### GitHubTypoCorpusRu
- `source`: a `string` feature
- `correction`: a `string` feature
- `domain`: a `string` feature
### Data Splits
#### RUSpellRU
| |train|test|
|---|---:|---:|
|RUSpellRU|2000|2008|
#### MultidomainGold
| |train|test|
|---|---:|---:|
|web|385|756|
|news|361|245|
|social_media|430|200|
|reviews|583|585|
|subtitles|1810|1810|
|strategic_documents|-|250|
|literature|-|260|
#### MedSpellcheck
| |test|
|---|---:|
|MedSpellcheck|1054|
#### GitHubTypoCorpusRu
| |test|
|---|---:|
|GitHubTypoCorpusRu|868|
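Each split ultimately yields `source`/`correction` string pairs like the examples shown above, so the edits the annotators made can be recovered with a simple token-level diff. A minimal sketch (not part of the benchmark tooling; the pair is the RUSpellRU example from the Data Instances section):

```python
from difflib import SequenceMatcher

def changed_tokens(source: str, correction: str):
    """Return (removed, added) whitespace-token lists between a noisy
    sentence and its corrected version."""
    src, cor = source.split(), correction.split()
    removed, added = [], []
    for op, i1, i2, j1, j2 in SequenceMatcher(a=src, b=cor).get_opcodes():
        if op in ("replace", "delete"):
            removed.extend(src[i1:i2])
        if op in ("replace", "insert"):
            added.extend(cor[j1:j2])
    return removed, added

pair = {
    "source": "давольно милый и летом и зимой обогреваемый теплым солнушком",
    "correction": "Довольно милый, и летом, и зимой обогреваемый тёплым солнышком.",
}
removed, added = changed_tokens(pair["source"], pair["correction"])
print(removed)  # tokens the annotators rewrote
print(added)    # their corrected counterparts
```

Tokens that appear unchanged in both sentences (e.g. "обогреваемый") are matched by the diff and reported in neither list.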
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
The datasets are chosen in accordance with the specified criteria.
First, domain variation: half of the datasets are chosen from different domains to ensure diversity, while the remaining half are from a single domain.
Another criterion is the presence of spelling (orthographic) and punctuation mistakes:
the datasets exclusively comprise mistypings, omitting grammatical or more complex errors made by non-native speakers.
- **RUSpellRU**: texts collected from [LiveJournal](https://www.livejournal.com/media), with manually corrected typos and errors;
- **MultidomainGold**: examples from several text sources including the open web, news, social media, reviews, subtitles, policy documents and literary works were collected:
*Aranea web-corpus* is a family of multilanguage gigaword web-corpora collected from Internet resources. The texts in the corpora are evenly distributed across periods, writing styles and topics they cover. We randomly picked the sentences from Araneum Russicum, which is harvested from the Russian part of the web.
*Literature* is a collection of Russian poems and prose of different classical literary works. We randomly picked sentences from the source dataset that were gathered from Ilibrary, LitLib, and Wikisource.
*News*, as the name suggests, covers news articles on various topics such as sports, politics, environment, economy etc. The passages are randomly picked from the summarization dataset Gazeta.ru.
*Social media* is the text domain from social media platforms marked with specific hashtags. These texts are typically short, written in an informal style and may contain slang, emojis and obscene lexis.
*Strategic Documents* is part of the dataset the Ministry of Economic Development of the Russian Federation collected. Texts are written in a bureaucratic manner, rich in embedded entities, and have complex syntactic and discourse structures. The full version of the dataset has been previously used in the RuREBus shared task.
- **MedSpellChecker**: texts with errors from medical anamnesis;
- **GitHubTypoCorpusRu**: spelling errors and typos in commits from [GitHub](https://github.com);
### Annotations
#### Annotation process
We set up a two-stage annotation project via the crowd-sourcing platform Toloka:
1. Data gathering stage: we provide the texts with possible mistakes to annotators and ask them to write the sentence correctly;
2. Validation stage: we provide annotators with the pair of sentences (source and its corresponding correction from the previous stage) and ask them to check if the correction is right.
We prepared instructions for annotators for each task. The instructions ask annotators to correct misspellings only if doing so does not alter the original style of the text.
The instructions do not provide rigorous criteria for distinguishing the nature of an error in terms of its origin - whether it came from an urge to endow a sentence with particular stylistic features or from an unintentional spelling violation - since it is time-consuming and laborious to describe every possible case of employing slang, dialect, colloquialisms, etc. instead of proper language. The instructions also do not distinguish errors that come from the geographical or social background of the source. Instead, we rely on annotators’ knowledge and understanding of a language since, in this work, the important factor is to preserve the original style of the text.
To ensure we receive qualified expertise, we set up a test iteration on a small subset of the data for both stages. We manually validated the test results and selected annotators who processed at least six samples (2% of the total test iteration) and did not make a single error. After the test iteration, we cut 85% and 86% of labellers for the gathering and validation stages, respectively.
We especially urge annotators to correct mistakes associated with the substitution of the letters "ё", "й" and "щ" for the corresponding "е", "и" and "ш", and not to explain abbreviations or correct punctuation errors. Each annotator is also warned about potentially sensitive topics in the data (e.g., politics, societal minorities, and religion).
The annotation of punctuation errors has been done in one iteration considering the low variation and difficulty of the task (relative to spelling correction). The annotators have been asked to correct punctuation signs in accordance with the rules of the Russian punctuation system.
#### Who are the annotators?
Native Russian speakers who passed the language exam.
The annotators for punctuation errors are also professional editors and linguists.
## Considerations for Using the Data
### Discussion of Biases
We clearly state our work’s aims and implications, making it open source and transparent. The data will be available under a public license. As our research involved anonymized textual data, informed consent from human participants was not required. However, we obtained permission to access publicly available datasets and ensured compliance with any applicable terms of service or usage policies.
### Other Known Limitations
The data used in our research may be limited to specific domains, preventing comprehensive coverage of all possible text variations. Despite these limitations, we tried to address the issue of data diversity by incorporating single-domain and multi-domain datasets in the proposed research. This approach allowed us to shed light on the diversity and variances within the data, providing valuable insights despite the inherent constraints.
We primarily focus on the Russian language. Further research is needed to expand the datasets to a wider range of languages.
## Additional Information
### Future plans
We are planning to expand our benchmark with both new Russian datasets and datasets in other languages including (but not limited to) European and CIS languages.
If you would like to contribute, please contact us.
### Dataset Curators
Nikita Martynov nikita.martynov.98@list.ru
### Licensing Information
All our datasets are published under the MIT License.
### Citation Information
```
@inproceedings{martynov2023augmentation,
title={Augmentation methods for spelling corruptions},
author={Martynov, Nikita and Baushenko, Mark and Abramov, Alexander and Fenogenova, Alena},
booktitle={Proceedings of the International Conference “Dialogue”},
volume={2023},
year={2023}
}
@inproceedings{martynov-etal-2024-methodology,
title = "A Methodology for Generative Spelling Correction via Natural Spelling Errors Emulation across Multiple Domains and Languages",
author = "Martynov, Nikita and
Baushenko, Mark and
Kozlova, Anastasia and
Kolomeytseva, Katerina and
Abramov, Aleksandr and
Fenogenova, Alena",
editor = "Graham, Yvette and
Purver, Matthew",
booktitle = "Findings of the Association for Computational Linguistics: EACL 2024",
month = mar,
year = "2024",
address = "St. Julian{'}s, Malta",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2024.findings-eacl.10",
pages = "138--155",
abstract = "Large language models excel in text generation and generalization, however they face challenges in text editing tasks, especially in correcting spelling errors and mistyping.In this paper, we present a methodology for generative spelling correction (SC), tested on English and Russian languages and potentially can be extended to any language with minor changes. Our research mainly focuses on exploring natural spelling errors and mistyping in texts and studying how those errors can be emulated in correct sentences to enrich generative models{'} pre-train procedure effectively. We investigate the effects of emulations in various text domains and examine two spelling corruption techniques: 1) first one mimics human behavior when making a mistake through leveraging statistics of errors from a particular dataset, and 2) second adds the most common spelling errors, keyboard miss clicks, and some heuristics within the texts.We conducted experiments employing various corruption strategies, models{'} architectures, and sizes in the pre-training and fine-tuning stages and evaluated the models using single-domain and multi-domain test sets. As a practical outcome of our work, we introduce SAGE (Spell checking via Augmentation and Generative distribution Emulation).",
}
``` |
bigscience-data/roots_ca_ted_talks_iwslt | ---
language: ca
license: cc-by-nc-nd-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_ca_ted_talks_iwslt
# WIT Ted Talks
- Dataset uid: `ted_talks_iwslt`
### Description
The Web Inventory Talk is a collection of the original TED talks and their translated versions. The translations are available in 109+ languages, though the distribution is not uniform.
### Homepage
https://github.com/huggingface/datasets/blob/master/datasets/ted_talks_iwslt/README.md
### Licensing
- open license
- cc-by-nc-4.0: Creative Commons Attribution Non Commercial 4.0 International
TED makes its collection of video recordings and transcripts of talks available under the Creative Commons BY-NC-ND license. WIT3 acknowledges the authorship of TED talks (BY condition) and does not redistribute transcripts for commercial purposes (NC). As regards the integrity of the work (ND), WIT3 only changes the format of the container while preserving the original contents. WIT3 aims to support research on human language processing as well as the diffusion of TED Talks!
### Speaker Locations
- Southern Europe
- Italy
### Sizes
- 0.0305 % of total
- 0.0736 % of ar
- 0.2002 % of pt
- 0.0128 % of zh
- 0.2236 % of vi
- 0.0330 % of fr
- 0.0545 % of es
- 0.0122 % of en
- 0.3704 % of id
- 0.0373 % of indic-hi
- 0.0330 % of indic-ta
- 0.1393 % of indic-mr
- 0.0305 % of ca
- 0.1179 % of indic-ur
- 0.0147 % of indic-bn
- 0.0240 % of indic-ml
- 0.0244 % of indic-te
- 0.0503 % of indic-gu
- 0.0211 % of indic-kn
- 0.0274 % of eu
- 0.0023 % of indic-as
- 0.0001 % of indic-pa
### BigScience processing steps
#### Filters applied to: ar
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: pt
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: zh
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: fr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: es
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: en
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: id
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: ca
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ur
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-as
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-pa
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
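The filter labels above can be read as simple document-level predicates. A hedged sketch of what each step might do (function names mirror the labels; this is an illustration, not the actual BigScience preprocessing code):

```python
def dedup_document(docs):
    """Keep only the first occurrence of each exact document text."""
    seen, kept = set(), []
    for d in docs:
        if d not in seen:
            seen.add(d)
            kept.append(d)
    return kept

def filter_remove_empty_docs(docs):
    """Drop documents that are empty or whitespace-only."""
    return [d for d in docs if d.strip()]

def filter_small_docs_bytes(docs, min_bytes=300):
    """Drop documents shorter than min_bytes when UTF-8 encoded."""
    return [d for d in docs if len(d.encode("utf-8")) >= min_bytes]

docs = ["", "short", "x" * 400, "x" * 400]
docs = dedup_document(docs)                # drops the duplicate long doc
docs = filter_remove_empty_docs(docs)      # drops ""
docs = filter_small_docs_bytes(docs, 300)  # drops "short"
print(docs == ["x" * 400])                 # True
```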
|
CyberHarem/dyute_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of dyute (Fire Emblem)
This is the dataset of dyute (Fire Emblem), containing 160 images and their tags.
The core tags of this character are `brown_hair, ponytail, bow, brown_eyes, long_hair, fang, hair_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 160 | 163.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dyute_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 160 | 101.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dyute_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 353 | 210.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dyute_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 160 | 149.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dyute_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 353 | 276.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dyute_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/dyute_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, bare_shoulders, bracelet, breastplate, open_mouth, simple_background, solo, cape, smile, boots, white_background, blush, dress, full_body |
| 1 | 15 |  |  |  |  |  | nipples, 1girl, nude, blush, navel, pussy, open_mouth, small_breasts, censored, solo_focus, spread_legs, 1boy, hetero, looking_at_viewer, sex, simple_background, smile, vaginal, penis |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | bracelet | breastplate | open_mouth | simple_background | solo | cape | smile | boots | white_background | blush | dress | full_body | nipples | nude | navel | pussy | small_breasts | censored | solo_focus | spread_legs | 1boy | hetero | looking_at_viewer | sex | vaginal | penis |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-----------|:--------------|:-------------|:--------------------|:-------|:-------|:--------|:--------|:-------------------|:--------|:--------|:------------|:----------|:-------|:--------|:--------|:----------------|:-----------|:-------------|:--------------|:-------|:---------|:--------------------|:------|:----------|:--------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | | | | X | X | | | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
ismailiismail/paragraphss_paraphrasing | ---
dataset_info:
features:
- name: phrase
dtype: string
- name: paraphrase
dtype: string
splits:
- name: train
num_bytes: 1848761
num_examples: 1000
download_size: 963985
dataset_size: 1848761
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "paragraphss_paraphrasing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
skrishna/coin_flip_7 | ---
dataset_info:
features:
- name: targets
dtype: string
- name: targets_vec
sequence: int64
- name: inputs
dtype: string
splits:
- name: test
num_bytes: 568628
num_examples: 2000
- name: train
num_bytes: 568912
num_examples: 2000
download_size: 288821
dataset_size: 1137540
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_MayaPH__opt-flan-iml-6.7b | ---
pretty_name: Evaluation run of MayaPH/opt-flan-iml-6.7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MayaPH/opt-flan-iml-6.7b](https://huggingface.co/MayaPH/opt-flan-iml-6.7b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MayaPH__opt-flan-iml-6.7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-13T03:06:32.697788](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__opt-flan-iml-6.7b/blob/main/results_2023-10-13T03-06-32.697788.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.07518875838926174,\n\
\ \"em_stderr\": 0.002700490526265294,\n \"f1\": 0.10838401845637569,\n\
\ \"f1_stderr\": 0.0028760995167941457,\n \"acc\": 0.3212312549329124,\n\
\ \"acc_stderr\": 0.006735003721960345\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.07518875838926174,\n \"em_stderr\": 0.002700490526265294,\n\
\ \"f1\": 0.10838401845637569,\n \"f1_stderr\": 0.0028760995167941457\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6424625098658248,\n\
\ \"acc_stderr\": 0.01347000744392069\n }\n}\n```"
repo_url: https://huggingface.co/MayaPH/opt-flan-iml-6.7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_13T03_06_32.697788
path:
- '**/details_harness|drop|3_2023-10-13T03-06-32.697788.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-13T03-06-32.697788.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_13T03_06_32.697788
path:
- '**/details_harness|gsm8k|5_2023-10-13T03-06-32.697788.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-13T03-06-32.697788.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_13T03_06_32.697788
path:
- '**/details_harness|winogrande|5_2023-10-13T03-06-32.697788.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-13T03-06-32.697788.parquet'
- config_name: results
data_files:
- split: 2023_10_13T03_06_32.697788
path:
- results_2023-10-13T03-06-32.697788.parquet
- split: latest
path:
- results_2023-10-13T03-06-32.697788.parquet
---
# Dataset Card for Evaluation run of MayaPH/opt-flan-iml-6.7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/MayaPH/opt-flan-iml-6.7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [MayaPH/opt-flan-iml-6.7b](https://huggingface.co/MayaPH/opt-flan-iml-6.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MayaPH__opt-flan-iml-6.7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-13T03:06:32.697788](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__opt-flan-iml-6.7b/blob/main/results_2023-10-13T03-06-32.697788.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.07518875838926174,
"em_stderr": 0.002700490526265294,
"f1": 0.10838401845637569,
"f1_stderr": 0.0028760995167941457,
"acc": 0.3212312549329124,
"acc_stderr": 0.006735003721960345
},
"harness|drop|3": {
"em": 0.07518875838926174,
"em_stderr": 0.002700490526265294,
"f1": 0.10838401845637569,
"f1_stderr": 0.0028760995167941457
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.6424625098658248,
"acc_stderr": 0.01347000744392069
}
}
```
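Judging by the numbers above, each metric in the top-level `all` entry appears to be the unweighted mean of that metric over the tasks that report it. A quick sketch verifying this (the dict literal simply restates the per-task values above):

```python
from collections import defaultdict

results = {
    "harness|drop|3": {"em": 0.07518875838926174, "f1": 0.10838401845637569},
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.6424625098658248},
}

# Average every metric name across the tasks that report it.
sums, counts = defaultdict(float), defaultdict(int)
for task_metrics in results.values():
    for name, value in task_metrics.items():
        sums[name] += value
        counts[name] += 1

aggregated = {name: sums[name] / counts[name] for name in counts}
print(aggregated["acc"])  # 0.3212312549329124, matching the "all" block above
```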
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Medilora/mimic_iii_diagnosis_anonymous | ---
license: mit
---
|
xlangai/arks_data | ---
language:
- en
multilinguality:
- monolingual
task_categories:
- text-retrieval
source_datasets:
- arks_data
task_ids:
- document-retrieval
config_names:
- corpus
tags:
- text-retrieval
configs:
- config_name: Pony
data_files:
- split: docs
path: "Pony/Pony_docs.jsonl"
- split: queries
path: "Pony/Pony_queries.jsonl"
- config_name: Ring
data_files:
- split: docs
path: "Ring/Ring_docs.jsonl"
- split: queries
path: "Ring/Ring_queries.jsonl"
- config_name: ScipyM
data_files:
- split: docs
path: "ScipyM/ScipyM_docs.jsonl"
- split: queries
path: "ScipyM/ScipyM_queries.jsonl"
- config_name: TensorflowM
data_files:
- split: docs
path: "TensorflowM/TensorflowM_docs.jsonl"
- split: queries
path: "TensorflowM/TensorflowM_queries.jsonl"
---
# Dataset Card for arks_data
This dataset contains 4 sub-datasets, namely Pony, Ring, ScipyM, and TensorflowM. You can find more information about this dataset in our paper **"ARKS: Active Retrieval in Knowledge Soup for Code Generation"**.
Paper arXiv link: https://arxiv.org/abs/2402.12317
Paper website: https://arks-codegen.github.io
# How to load this dataset
Load one sub-dataset:
```python
from datasets import load_dataset
data_files = {"corpus": "Pony/Pony_docs.jsonl"}
dataset = load_dataset("xlangai/arks_data", data_files=data_files)
```
Load several sub-datasets:
```python
from datasets import load_dataset
data_files = {"corpus": ["Pony/Pony_docs.jsonl", "Ring/Ring_docs.jsonl"]}
dataset = load_dataset("xlangai/arks_data", data_files=data_files)
``` |
bjoernp/laion-2b-mistral_captions-1.3M | ---
dataset_info:
features:
- name: TEXT
dtype: string
- name: RESPONSE
dtype: string
- name: captions
sequence: string
splits:
- name: train
num_bytes: 853385896.3491833
num_examples: 1318108
download_size: 540262191
dataset_size: 853385896.3491833
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "laion-2b-mistral_captions-1.3M"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bstds/us_patent | ---
dataset_info:
features:
- name: id
dtype: string
- name: anchor
dtype: string
- name: target
dtype: string
- name: context
dtype: string
- name: score
dtype: float32
splits:
- name: train
num_bytes: 2580483
num_examples: 36473
- name: test
num_bytes: 2521
num_examples: 36
download_size: 1161327
dataset_size: 2583004
---
# Dataset Card for "us_patent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Dataset from the U.S. Patent Phrase to Phrase Matching Kaggle competition: https://www.kaggle.com/competitions/us-patent-phrase-to-phrase-matching
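A hypothetical record matching the schema above may help orient readers; the values below are invented for illustration, not actual rows. In the underlying competition, `context` is a CPC classification code and `score` is a phrase-similarity rating drawn from {0.0, 0.25, 0.5, 0.75, 1.0}:

```python
# Illustrative record (values are made up; the id is hypothetical).
record = {
    "id": "example-0001",
    "anchor": "abatement",
    "target": "abatement of pollution",
    "context": "A47",   # CPC code
    "score": 0.5,
}

# One way to bucket the float score into coarse match categories:
def match_label(score: float) -> str:
    if score >= 0.75:
        return "close match"
    if score >= 0.5:
        return "related"
    return "weak or no match"

print(match_label(record["score"]))  # related
```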
arieg/bw_spec_cls_4_22_s_200 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '1706'
'1': '1720'
'2': '1732'
'3': '1733'
splits:
- name: train
num_bytes: 43566639.0
num_examples: 800
- name: test
num_bytes: 1095432.0
num_examples: 20
download_size: 38693515
dataset_size: 44662071.0
---
# Dataset Card for "bw_spec_cls_4_22_s_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
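The `class_label` mapping in the config above ('0' → '1706', …, '3' → '1733') can be reconstructed in plain Python, mirroring what the `datasets` `ClassLabel` feature exposes via its `int2str`/`str2int` helpers (a sketch, not the library API itself):

```python
# Label names in index order, taken from the config above.
names = ["1706", "1720", "1732", "1733"]

int2str = dict(enumerate(names))           # 0 -> "1706", 1 -> "1720", ...
str2int = {n: i for i, n in enumerate(names)}

assert int2str[3] == "1733"
assert str2int["1720"] == 1
```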
veezbo/phinc | ---
license: cc-by-4.0
task_categories:
- translation
- text2text-generation
language:
- en
- hi
pretty_name: A Parallel Hinglish Social Media Code-Mixed Corpus for Machine Translation
size_categories:
- 10K<n<100K
---
# Description
PHINC is a parallel corpus for machine translation pairing code-mixed Hinglish (a fusion of Hindi and English commonly used in modern India) with human-generated English translations.
# Credit
All credit goes to:
[PHINC: A Parallel Hinglish Social Media Code-Mixed Corpus for Machine Translation](https://aclanthology.org/2020.wnut-1.7) (Srivastava & Singh, WNUT 2020)
# Original Abstract
Code-mixing is the phenomenon of using more than one language in a sentence. It is a very frequently observed pattern of communication on social media platforms. Flexibility to use mixed languages in one text message might help to communicate efficiently with the target audience. But, it adds to the challenge of processing and understanding natural language to a much larger extent. Here, we are presenting a parallel corpus of the 13,738 code-mixed English-Hindi sentences and their corresponding translation in English. The translations of sentences are done manually by the annotators. We are releasing the parallel corpus to facilitate future research opportunities for code-mixed machine translation.
## Note
This data has been automatically converted into a Hugging Face dataset (including a conversion to Parquet). The original raw dataset can be found [here](https://zenodo.org/record/3605597).
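As a concrete illustration of the parallel format, here is a hypothetical source/target pair. The sentence is invented, and the field names `source`/`target` are our assumption, since the converted dataset's actual column names are not stated here:

```python
# Invented Hinglish-English pair in the shape a parallel MT corpus provides:
# one code-mixed source sentence and its human English translation.
pair = {
    "source": "aaj ka match bahut exciting tha!",   # code-mixed Hinglish
    "target": "Today's match was very exciting!",   # English translation
}

# Translation corpora are typically consumed as (source, target) tuples
# for training or evaluating an MT model:
src, tgt = pair["source"], pair["target"]
assert isinstance(src, str) and isinstance(tgt, str)
```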