| datasetId | card |
|---|---|
d0rj/conala-mined-ru | ---
dataset_info:
features:
- name: question_id
dtype: int64
- name: parent_answer_post_id
dtype: int64
- name: prob
dtype: float64
- name: snippet
dtype: string
- name: intent
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 115298717.0
num_examples: 593891
download_size: 78541384
dataset_size: 115298717.0
---
# Dataset Card for "conala-mined-ru"
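The frontmatter above defines the schema (`question_id`, `parent_answer_post_id`, `prob`, `snippet`, `intent`, `id`). As a minimal sketch of how such records might be consumed, the helper below filters mined intent/snippet pairs by their mining probability; the sample rows and the 0.5 threshold are illustrative assumptions, not part of the dataset.

```python
# Minimal sketch: filter mined intent/snippet pairs by mining probability.
# The sample rows below are hypothetical and only mirror the card's schema;
# real data would come from load_dataset("d0rj/conala-mined-ru", split="train").
def filter_confident(records, threshold=0.5):
    """Keep only pairs whose mining probability exceeds `threshold`."""
    return [r for r in records if r["prob"] > threshold]

sample = [
    {"question_id": 1, "parent_answer_post_id": 2, "prob": 0.9,
     "snippet": "print(x)", "intent": "напечатать x", "id": "1_2"},
    {"question_id": 3, "parent_answer_post_id": 4, "prob": 0.1,
     "snippet": "y = []", "intent": "создать пустой список", "id": "3_4"},
]
confident = filter_confident(sample)
print(len(confident))  # 1: only the prob=0.9 pair survives
```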
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-samsum-samsum-ec1044-95879146522 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: Joemgu/mlong-t5-large-sumstew
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: Joemgu/mlong-t5-large-sumstew
* Dataset: samsum
* Config: samsum
* Split: test
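The `col_mapping` in the metadata above tells the evaluator which samsum columns fill its generic roles: `dialogue` is the input text, `summary` is the target. A minimal sketch of that remapping on a hypothetical record (the record itself is made up for illustration):

```python
# col_mapping from the card: the generic "text" role is filled by samsum's
# "dialogue" column, and "target" by its "summary" column.
col_mapping = {"text": "dialogue", "target": "summary"}

# A hypothetical samsum-style record, for illustration only.
record = {
    "dialogue": "Amanda: hi! Jerry: hello.",
    "summary": "Amanda and Jerry greet each other.",
}

mapped = {role: record[column] for role, column in col_mapping.items()}
print(mapped["text"])    # the dialogue becomes the model input
print(mapped["target"])  # the summary becomes the reference
```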
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@baohuynhbk14](https://huggingface.co/baohuynhbk14) for evaluating this model. |
CyberHarem/mutsu_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mutsu/陸奥 (Kantai Collection)
This is the dataset of mutsu/陸奥 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `brown_hair, short_hair, green_eyes, breasts, large_breasts, headgear, hair_between_eyes`; they are pruned from the per-image tag lists in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 608.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mutsu_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 377.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mutsu_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1228 | 781.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mutsu_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 551.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mutsu_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1228 | 1.02 GiB | [Download](https://huggingface.co/datasets/CyberHarem/mutsu_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/mutsu_kantaicollection',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
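The non-raw packages (e.g. `dataset-800.zip`) are listed above as plain `IMG+TXT`: each image ships with a same-named `.txt` file carrying its tags. If you prefer not to depend on waifuc, here is a minimal sketch of pairing the extracted files, assuming `dataset_dir` holds the unzipped archive:

```python
import os

def pair_images_with_tags(dataset_dir):
    """Map each image filename to the tag string in its sibling .txt file.

    Assumes the IMG+TXT layout described above: foo.png next to foo.txt.
    """
    pairs = {}
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() in {".png", ".jpg", ".jpeg", ".webp"}:
            txt_path = os.path.join(dataset_dir, stem + ".txt")
            if os.path.exists(txt_path):
                with open(txt_path, encoding="utf-8") as f:
                    pairs[name] = f.read().strip()
    return pairs
```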
## List of Clusters
Results of tag clustering; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, chain, looking_at_viewer, pleated_skirt, solo, white_gloves, black_skirt, smile, midriff, miniskirt, red_thighhighs, radio_antenna, simple_background, turret, cleavage, navel, rigging, white_background, blush, cannon |
| 1 | 39 |  |  |  |  |  | midriff, solo, 1girl, looking_at_viewer, white_gloves, bare_shoulders, miniskirt, hairband, navel, chain, red_thighhighs, smile |
| 2 | 14 |  |  |  |  |  | 1girl, collarbone, looking_at_viewer, solo, smile, blush, simple_background, white_background, closed_mouth, cleavage, upper_body, alternate_costume |
| 3 | 8 |  |  |  |  |  | 1girl, cloud, day, looking_at_viewer, outdoors, solo, blush, collarbone, cowboy_shot, navel, smile, beach, black_bikini, cleavage, ocean, side-tie_bikini_bottom, closed_mouth, flipped_hair, blue_sky, open_mouth |
| 4 | 15 |  |  |  |  |  | 1girl, huge_breasts, looking_at_viewer, solo, thick_thighs, curvy, thighhighs, navel, cleavage, wide_hips, bikini, gloves, smile, tan, collar |
| 5 | 23 |  |  |  |  |  | 1girl, solo, alternate_costume, looking_at_viewer, smile, obi, wide_sleeves, blush, long_sleeves, yukata, floral_print, hair_flower, striped, upper_body |
| 6 | 15 |  |  |  |  |  | 1girl, fake_animal_ears, playboy_bunny, rabbit_ears, solo, detached_collar, wrist_cuffs, black_leotard, looking_at_viewer, bowtie, cleavage, pantyhose, simple_background, smile, strapless_leotard, blush, rabbit_tail, white_background, collarbone, cowboy_shot |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | chain | looking_at_viewer | pleated_skirt | solo | white_gloves | black_skirt | smile | midriff | miniskirt | red_thighhighs | radio_antenna | simple_background | turret | cleavage | navel | rigging | white_background | blush | cannon | bare_shoulders | hairband | collarbone | closed_mouth | upper_body | alternate_costume | cloud | day | outdoors | cowboy_shot | beach | black_bikini | ocean | side-tie_bikini_bottom | flipped_hair | blue_sky | open_mouth | huge_breasts | thick_thighs | curvy | thighhighs | wide_hips | bikini | gloves | tan | collar | obi | wide_sleeves | long_sleeves | yukata | floral_print | hair_flower | striped | fake_animal_ears | playboy_bunny | rabbit_ears | detached_collar | wrist_cuffs | black_leotard | bowtie | pantyhose | strapless_leotard | rabbit_tail |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:----------------|:-------|:---------------|:--------------|:--------|:----------|:------------|:-----------------|:----------------|:--------------------|:---------|:-----------|:--------|:----------|:-------------------|:--------|:---------|:-----------------|:-----------|:-------------|:---------------|:-------------|:--------------------|:--------|:------|:-----------|:--------------|:--------|:---------------|:--------|:-------------------------|:---------------|:-----------|:-------------|:---------------|:---------------|:--------|:-------------|:------------|:---------|:---------|:------|:---------|:------|:---------------|:---------------|:---------|:---------------|:--------------|:----------|:-------------------|:----------------|:--------------|:------------------|:--------------|:----------------|:---------|:------------|:--------------------|:--------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 39 |  |  |  |  |  | X | X | X | | X | X | | X | X | X | X | | | | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | | X | | X | | | X | | | | | X | | X | | | X | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | X | | X | | | X | | | | | | | X | X | | | X | | | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 15 |  |  |  |  |  | X | | X | | X | | | X | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 5 | 23 |  |  |  |  |  | X | | X | | X | | | X | | | | | | | | | | | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | |
| 6 | 15 |  |  |  |  |  | X | | X | | X | | | X | | | | | X | | X | | | X | X | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
fformosa/LSUN_bedroom_VQA_v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: size
sequence: int64
- name: proportion
dtype: float64
- name: new_image_id
dtype: int64
- name: new_attributes
sequence: string
splits:
- name: train
num_bytes: 836782860
num_examples: 58266
download_size: 815988446
dataset_size: 836782860
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- question-answering
- zero-shot-classification
- feature-extraction
- visual-question-answering
- image-classification
language:
- en
size_categories:
- 10K<n<100K
---
# Dataset Card for "LSUN_bedroom_VQA_v2"
 |
open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-laser | ---
pretty_name: Evaluation run of Kquant03/NeuralTrix-7B-dpo-laser
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kquant03/NeuralTrix-7B-dpo-laser](https://huggingface.co/Kquant03/NeuralTrix-7B-dpo-laser)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-laser\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-14T14:37:51.781058](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-laser/blob/main/results_2024-02-14T14-37-51.781058.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.645893629951843,\n\
\ \"acc_stderr\": 0.03226186920457953,\n \"acc_norm\": 0.6452717180915718,\n\
\ \"acc_norm_stderr\": 0.03293869736005958,\n \"mc1\": 0.6291309669522643,\n\
\ \"mc1_stderr\": 0.01690969358024884,\n \"mc2\": 0.7814534342142354,\n\
\ \"mc2_stderr\": 0.013720660446321574\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6911262798634812,\n \"acc_stderr\": 0.013501770929344003,\n\
\ \"acc_norm\": 0.7133105802047781,\n \"acc_norm_stderr\": 0.013214986329274772\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7005576578370842,\n\
\ \"acc_stderr\": 0.004570777326263903,\n \"acc_norm\": 0.8850826528579964,\n\
\ \"acc_norm_stderr\": 0.00318270383035113\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406762,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406762\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n\
\ \"acc_stderr\": 0.024362599693031086,\n \"acc_norm\": 0.7580645161290323,\n\
\ \"acc_norm_stderr\": 0.024362599693031086\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276877\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606649,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606649\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092427,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092427\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931048,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931048\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4424581005586592,\n\
\ \"acc_stderr\": 0.01661139368726858,\n \"acc_norm\": 0.4424581005586592,\n\
\ \"acc_norm_stderr\": 0.01661139368726858\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n\
\ \"acc_stderr\": 0.012747248967079072,\n \"acc_norm\": 0.470013037809648,\n\
\ \"acc_norm_stderr\": 0.012747248967079072\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080632,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080632\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6291309669522643,\n\
\ \"mc1_stderr\": 0.01690969358024884,\n \"mc2\": 0.7814534342142354,\n\
\ \"mc2_stderr\": 0.013720660446321574\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8445146014206788,\n \"acc_stderr\": 0.010184308214775777\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.690674753601213,\n \
\ \"acc_stderr\": 0.012731710925078134\n }\n}\n```"
repo_url: https://huggingface.co/Kquant03/NeuralTrix-7B-dpo-laser
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|arc:challenge|25_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|gsm8k|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hellaswag|10_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T14-37-51.781058.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T14-37-51.781058.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- '**/details_harness|winogrande|5_2024-02-14T14-37-51.781058.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-14T14-37-51.781058.parquet'
- config_name: results
data_files:
- split: 2024_02_14T14_37_51.781058
path:
- results_2024-02-14T14-37-51.781058.parquet
- split: latest
path:
- results_2024-02-14T14-37-51.781058.parquet
---
# Dataset Card for Evaluation run of Kquant03/NeuralTrix-7B-dpo-laser
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/NeuralTrix-7B-dpo-laser](https://huggingface.co/Kquant03/NeuralTrix-7B-dpo-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-laser",
"harness_winogrande_5",
	split="latest")
```
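Since each timestamped split name encodes when the evaluation run happened (e.g. `2024_02_14T14_37_51.781058`), you can recover a sortable `datetime` from it, which is handy when a repo accumulates several runs. A minimal sketch; the helper name `split_to_datetime` is ours, not part of the `datasets` API:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    """Parse a run-timestamp split name like '2024_02_14T14_37_51.781058'."""
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

ts = split_to_datetime("2024_02_14T14_37_51.781058")
print(ts.isoformat())  # 2024-02-14T14:37:51.781058
```

Sorting a list of such split names with this key yields the runs in chronological order, with the last element matching the `"latest"` split.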
## Latest results
These are the [latest results from run 2024-02-14T14:37:51.781058](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-laser/blob/main/results_2024-02-14T14-37-51.781058.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task's results can be found in its timestamped split and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.645893629951843,
"acc_stderr": 0.03226186920457953,
"acc_norm": 0.6452717180915718,
"acc_norm_stderr": 0.03293869736005958,
"mc1": 0.6291309669522643,
"mc1_stderr": 0.01690969358024884,
"mc2": 0.7814534342142354,
"mc2_stderr": 0.013720660446321574
},
"harness|arc:challenge|25": {
"acc": 0.6911262798634812,
"acc_stderr": 0.013501770929344003,
"acc_norm": 0.7133105802047781,
"acc_norm_stderr": 0.013214986329274772
},
"harness|hellaswag|10": {
"acc": 0.7005576578370842,
"acc_stderr": 0.004570777326263903,
"acc_norm": 0.8850826528579964,
"acc_norm_stderr": 0.00318270383035113
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406762,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406762
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031086,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031086
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276877,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276877
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606649,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606649
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092427,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092427
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931048,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931048
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4424581005586592,
"acc_stderr": 0.01661139368726858,
"acc_norm": 0.4424581005586592,
"acc_norm_stderr": 0.01661139368726858
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079072,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079072
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080632,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080632
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6291309669522643,
"mc1_stderr": 0.01690969358024884,
"mc2": 0.7814534342142354,
"mc2_stderr": 0.013720660446321574
},
"harness|winogrande|5": {
"acc": 0.8445146014206788,
"acc_stderr": 0.010184308214775777
},
"harness|gsm8k|5": {
"acc": 0.690674753601213,
"acc_stderr": 0.012731710925078134
}
}
```
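Each per-task entry in the results above shares the same shape: a `harness|<task>|<n_shot>` key mapping to metric fields such as `acc` and `acc_stderr`. As a minimal sketch (not part of the original evaluation output), this is how such a results dict can be parsed to average a metric across tasks; the two sample entries reuse the winogrande and gsm8k values shown above.

```python
import json

# Minimal sketch: parse a results dict of the shape shown above and
# average the "acc" metric across all task entries that report it.
# The two sample entries reuse the winogrande and gsm8k values above.
results = json.loads("""
{
  "harness|winogrande|5": {"acc": 0.8445146014206788, "acc_stderr": 0.010184308214775777},
  "harness|gsm8k|5": {"acc": 0.690674753601213, "acc_stderr": 0.012731710925078134}
}
""")

accs = [metrics["acc"] for metrics in results.values() if "acc" in metrics]
mean_acc = sum(accs) / len(accs)
print(f"mean acc over {len(accs)} tasks: {mean_acc:.4f}")
```

The same pattern applies to `acc_norm` or any other metric key present in the per-task dicts.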
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
syeda-raisa/paraphrase_final | ---
license: apache-2.0
---
# Idiom paraphrase |
socius/VADER_2014 | ---
license: cc-by-nc-sa-4.0
language:
- en
tags:
- sentiment
- social media
size_categories:
- 1K<n<10K
---
|
open-llm-leaderboard/details_xzuyn__GPT-2-SlimOrcaDeduped-airoboros-3.1-MetaMathQA-SFT-124M | ---
pretty_name: Evaluation run of xzuyn/GPT-2-SlimOrcaDeduped-airoboros-3.1-MetaMathQA-SFT-124M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xzuyn/GPT-2-SlimOrcaDeduped-airoboros-3.1-MetaMathQA-SFT-124M](https://huggingface.co/xzuyn/GPT-2-SlimOrcaDeduped-airoboros-3.1-MetaMathQA-SFT-124M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xzuyn__GPT-2-SlimOrcaDeduped-airoboros-3.1-MetaMathQA-SFT-124M\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-12T07:34:24.161560](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__GPT-2-SlimOrcaDeduped-airoboros-3.1-MetaMathQA-SFT-124M/blob/main/results_2023-12-12T07-34-24.161560.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2574568669647554,\n\
\ \"acc_stderr\": 0.03077370055460631,\n \"acc_norm\": 0.258555233951132,\n\
\ \"acc_norm_stderr\": 0.031529415596877046,\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707687,\n \"mc2\": 0.38841425251899087,\n\
\ \"mc2_stderr\": 0.014859033830490542\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.1945392491467577,\n \"acc_stderr\": 0.011567709174648728,\n\
\ \"acc_norm\": 0.24573378839590443,\n \"acc_norm_stderr\": 0.012581033453730113\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2842063333997212,\n\
\ \"acc_stderr\": 0.004501137895230712,\n \"acc_norm\": 0.29426409081856203,\n\
\ \"acc_norm_stderr\": 0.00454779896412667\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.037498507091740234,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.037498507091740234\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.02461829819586651,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.02461829819586651\n },\n\
\ \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.03345036916788992,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.03345036916788992\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.15,\n \"acc_stderr\": 0.035887028128263734,\n \"acc_norm\": 0.15,\n\
\ \"acc_norm_stderr\": 0.035887028128263734\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.0292418838696288,\n\
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.0292418838696288\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.33793103448275863,\n \"acc_stderr\": 0.039417076320648906,\n\
\ \"acc_norm\": 0.33793103448275863,\n \"acc_norm_stderr\": 0.039417076320648906\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.02226181769240019,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.02226181769240019\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n\
\ \"acc_stderr\": 0.03455071019102146,\n \"acc_norm\": 0.18253968253968253,\n\
\ \"acc_norm_stderr\": 0.03455071019102146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3064516129032258,\n \"acc_stderr\": 0.026226485652553883,\n \"\
acc_norm\": 0.3064516129032258,\n \"acc_norm_stderr\": 0.026226485652553883\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2561576354679803,\n \"acc_stderr\": 0.030712730070982592,\n \"\
acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.030712730070982592\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3434343434343434,\n \"acc_stderr\": 0.033832012232444426,\n \"\
acc_norm\": 0.3434343434343434,\n \"acc_norm_stderr\": 0.033832012232444426\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.37823834196891193,\n \"acc_stderr\": 0.034998072761933396,\n\
\ \"acc_norm\": 0.37823834196891193,\n \"acc_norm_stderr\": 0.034998072761933396\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.35128205128205126,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.35128205128205126,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.027886828078380565,\n\
\ \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.027886828078380565\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3211009174311927,\n \"acc_stderr\": 0.020018149772733744,\n \"\
acc_norm\": 0.3211009174311927,\n \"acc_norm_stderr\": 0.020018149772733744\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.03096451792692341,\n \"\
acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.03096451792692341\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2320675105485232,\n \"acc_stderr\": 0.027479744550808514,\n \
\ \"acc_norm\": 0.2320675105485232,\n \"acc_norm_stderr\": 0.027479744550808514\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.23766816143497757,\n\
\ \"acc_stderr\": 0.02856807946471428,\n \"acc_norm\": 0.23766816143497757,\n\
\ \"acc_norm_stderr\": 0.02856807946471428\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.1984732824427481,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.1984732824427481,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2892561983471074,\n \"acc_stderr\": 0.04139112727635464,\n \"\
acc_norm\": 0.2892561983471074,\n \"acc_norm_stderr\": 0.04139112727635464\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.04007341809755806,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.04007341809755806\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.041858325989283136,\n\
\ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.041858325989283136\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19230769230769232,\n\
\ \"acc_stderr\": 0.0258192332564837,\n \"acc_norm\": 0.19230769230769232,\n\
\ \"acc_norm_stderr\": 0.0258192332564837\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2796934865900383,\n\
\ \"acc_stderr\": 0.016050792148036546,\n \"acc_norm\": 0.2796934865900383,\n\
\ \"acc_norm_stderr\": 0.016050792148036546\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.02279711027807114,\n\
\ \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.02279711027807114\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26366559485530544,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.26366559485530544,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023132376234543332,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023132376234543332\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2801418439716312,\n \"acc_stderr\": 0.026789172351140245,\n \
\ \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.026789172351140245\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23663624511082137,\n\
\ \"acc_stderr\": 0.010855137351572735,\n \"acc_norm\": 0.23663624511082137,\n\
\ \"acc_norm_stderr\": 0.010855137351572735\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4338235294117647,\n \"acc_stderr\": 0.030105636570016643,\n\
\ \"acc_norm\": 0.4338235294117647,\n \"acc_norm_stderr\": 0.030105636570016643\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2647058823529412,\n \"acc_stderr\": 0.017848089574913226,\n \
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.017848089574913226\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.17272727272727273,\n\
\ \"acc_stderr\": 0.0362069183392922,\n \"acc_norm\": 0.17272727272727273,\n\
\ \"acc_norm_stderr\": 0.0362069183392922\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2163265306122449,\n \"acc_stderr\": 0.026358916334904035,\n\
\ \"acc_norm\": 0.2163265306122449,\n \"acc_norm_stderr\": 0.026358916334904035\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1927710843373494,\n\
\ \"acc_stderr\": 0.03070982405056527,\n \"acc_norm\": 0.1927710843373494,\n\
\ \"acc_norm_stderr\": 0.03070982405056527\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23976608187134502,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.23976608187134502,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707687,\n \"mc2\": 0.38841425251899087,\n\
\ \"mc2_stderr\": 0.014859033830490542\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.49013417521704816,\n \"acc_stderr\": 0.014049749833367589\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02122820318423048,\n \
\ \"acc_stderr\": 0.003970449129848635\n }\n}\n```"
repo_url: https://huggingface.co/xzuyn/GPT-2-SlimOrcaDeduped-airoboros-3.1-MetaMathQA-SFT-124M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|arc:challenge|25_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|gsm8k|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hellaswag|10_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T07-34-24.161560.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-12T07-34-24.161560.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- '**/details_harness|winogrande|5_2023-12-12T07-34-24.161560.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-12T07-34-24.161560.parquet'
- config_name: results
data_files:
- split: 2023_12_12T07_34_24.161560
path:
- results_2023-12-12T07-34-24.161560.parquet
- split: latest
path:
- results_2023-12-12T07-34-24.161560.parquet
---
# Dataset Card for Evaluation run of xzuyn/GPT-2-SlimOrcaDeduped-airoboros-3.1-MetaMathQA-SFT-124M
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [xzuyn/GPT-2-SlimOrcaDeduped-airoboros-3.1-MetaMathQA-SFT-124M](https://huggingface.co/xzuyn/GPT-2-SlimOrcaDeduped-airoboros-3.1-MetaMathQA-SFT-124M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xzuyn__GPT-2-SlimOrcaDeduped-airoboros-3.1-MetaMathQA-SFT-124M",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-12T07:34:24.161560](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__GPT-2-SlimOrcaDeduped-airoboros-3.1-MetaMathQA-SFT-124M/blob/main/results_2023-12-12T07-34-24.161560.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.2574568669647554,
"acc_stderr": 0.03077370055460631,
"acc_norm": 0.258555233951132,
"acc_norm_stderr": 0.031529415596877046,
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707687,
"mc2": 0.38841425251899087,
"mc2_stderr": 0.014859033830490542
},
"harness|arc:challenge|25": {
"acc": 0.1945392491467577,
"acc_stderr": 0.011567709174648728,
"acc_norm": 0.24573378839590443,
"acc_norm_stderr": 0.012581033453730113
},
"harness|hellaswag|10": {
"acc": 0.2842063333997212,
"acc_stderr": 0.004501137895230712,
"acc_norm": 0.29426409081856203,
"acc_norm_stderr": 0.00454779896412667
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.037498507091740234,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.037498507091740234
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2,
"acc_stderr": 0.02461829819586651,
"acc_norm": 0.2,
"acc_norm_stderr": 0.02461829819586651
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.03345036916788992,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.03345036916788992
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.15,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.15,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.0292418838696288,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.0292418838696288
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.33793103448275863,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.33793103448275863,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.02226181769240019,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.02226181769240019
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.03455071019102146,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.03455071019102146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3064516129032258,
"acc_stderr": 0.026226485652553883,
"acc_norm": 0.3064516129032258,
"acc_norm_stderr": 0.026226485652553883
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.030712730070982592,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.030712730070982592
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3434343434343434,
"acc_stderr": 0.033832012232444426,
"acc_norm": 0.3434343434343434,
"acc_norm_stderr": 0.033832012232444426
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37823834196891193,
"acc_stderr": 0.034998072761933396,
"acc_norm": 0.37823834196891193,
"acc_norm_stderr": 0.034998072761933396
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.35128205128205126,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.35128205128205126,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.027886828078380565,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.027886828078380565
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3211009174311927,
"acc_stderr": 0.020018149772733744,
"acc_norm": 0.3211009174311927,
"acc_norm_stderr": 0.020018149772733744
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.03096451792692341,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.03096451792692341
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2320675105485232,
"acc_stderr": 0.027479744550808514,
"acc_norm": 0.2320675105485232,
"acc_norm_stderr": 0.027479744550808514
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.23766816143497757,
"acc_stderr": 0.02856807946471428,
"acc_norm": 0.23766816143497757,
"acc_norm_stderr": 0.02856807946471428
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.1984732824427481,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.1984732824427481,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2892561983471074,
"acc_stderr": 0.04139112727635464,
"acc_norm": 0.2892561983471074,
"acc_norm_stderr": 0.04139112727635464
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755806,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755806
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19230769230769232,
"acc_stderr": 0.0258192332564837,
"acc_norm": 0.19230769230769232,
"acc_norm_stderr": 0.0258192332564837
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2796934865900383,
"acc_stderr": 0.016050792148036546,
"acc_norm": 0.2796934865900383,
"acc_norm_stderr": 0.016050792148036546
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.02279711027807114,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.02279711027807114
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26366559485530544,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.26366559485530544,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023132376234543332,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023132376234543332
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.026789172351140245,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.026789172351140245
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23663624511082137,
"acc_stderr": 0.010855137351572735,
"acc_norm": 0.23663624511082137,
"acc_norm_stderr": 0.010855137351572735
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4338235294117647,
"acc_stderr": 0.030105636570016643,
"acc_norm": 0.4338235294117647,
"acc_norm_stderr": 0.030105636570016643
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.017848089574913226,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.017848089574913226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.17272727272727273,
"acc_stderr": 0.0362069183392922,
"acc_norm": 0.17272727272727273,
"acc_norm_stderr": 0.0362069183392922
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2163265306122449,
"acc_stderr": 0.026358916334904035,
"acc_norm": 0.2163265306122449,
"acc_norm_stderr": 0.026358916334904035
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1927710843373494,
"acc_stderr": 0.03070982405056527,
"acc_norm": 0.1927710843373494,
"acc_norm_stderr": 0.03070982405056527
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23976608187134502,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.23976608187134502,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707687,
"mc2": 0.38841425251899087,
"mc2_stderr": 0.014859033830490542
},
"harness|winogrande|5": {
"acc": 0.49013417521704816,
"acc_stderr": 0.014049749833367589
},
"harness|gsm8k|5": {
"acc": 0.02122820318423048,
"acc_stderr": 0.003970449129848635
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
bys2058/mydataset_bai | ---
dataset_info:
features:
- name: image
dtype: image
- name: image_caption
dtype: string
splits:
- name: train
num_bytes: 672992285.125
num_examples: 1375
download_size: 672547372
dataset_size: 672992285.125
---
# Dataset Card for "mydataset_bai"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deivsu/2ne1CL | ---
license: openrail
---
|
joey234/mmlu-human_sexuality-original-neg | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 3185.297709923664
num_examples: 13
download_size: 6364
dataset_size: 3185.297709923664
---
# Dataset Card for "mmlu-human_sexuality-original-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ciempiess/voxforge_spanish | ---
license: gpl-3.0
---
|
wmt/wmt16 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- cs
- de
- en
- fi
- ro
- ru
- tr
license:
- unknown
multilinguality:
- translation
size_categories:
- 10M<n<100M
source_datasets:
- extended|europarl_bilingual
- extended|news_commentary
- extended|setimes
- extended|un_multi
task_categories:
- translation
task_ids: []
paperswithcode_id: wmt-2016
pretty_name: WMT16
dataset_info:
- config_name: cs-en
features:
- name: translation
dtype:
translation:
languages:
- cs
- en
splits:
- name: train
num_bytes: 295995226
num_examples: 997240
- name: validation
num_bytes: 572195
num_examples: 2656
- name: test
num_bytes: 707862
num_examples: 2999
download_size: 178250444
dataset_size: 297275283
- config_name: de-en
features:
- name: translation
dtype:
translation:
languages:
- de
- en
splits:
- name: train
num_bytes: 1373099816
num_examples: 4548885
- name: validation
num_bytes: 522981
num_examples: 2169
- name: test
num_bytes: 735508
num_examples: 2999
download_size: 827152589
dataset_size: 1374358305
- config_name: fi-en
features:
- name: translation
dtype:
translation:
languages:
- fi
- en
splits:
- name: train
num_bytes: 605145153
num_examples: 2073394
- name: validation
num_bytes: 306327
num_examples: 1370
- name: test
num_bytes: 1410507
num_examples: 6000
download_size: 348306427
dataset_size: 606861987
- config_name: ro-en
features:
- name: translation
dtype:
translation:
languages:
- ro
- en
splits:
- name: train
num_bytes: 188287711
num_examples: 610320
- name: validation
num_bytes: 561791
num_examples: 1999
- name: test
num_bytes: 539208
num_examples: 1999
download_size: 108584039
dataset_size: 189388710
- config_name: ru-en
features:
- name: translation
dtype:
translation:
languages:
- ru
- en
splits:
- name: train
num_bytes: 448322024
num_examples: 1516162
- name: validation
num_bytes: 955964
num_examples: 2818
- name: test
num_bytes: 1050669
num_examples: 2998
download_size: 231557371
dataset_size: 450328657
- config_name: tr-en
features:
- name: translation
dtype:
translation:
languages:
- tr
- en
splits:
- name: train
num_bytes: 60416449
num_examples: 205756
- name: validation
num_bytes: 240642
num_examples: 1001
- name: test
num_bytes: 732428
num_examples: 3000
download_size: 37389436
dataset_size: 61389519
configs:
- config_name: cs-en
data_files:
- split: train
path: cs-en/train-*
- split: validation
path: cs-en/validation-*
- split: test
path: cs-en/test-*
- config_name: de-en
data_files:
- split: train
path: de-en/train-*
- split: validation
path: de-en/validation-*
- split: test
path: de-en/test-*
- config_name: fi-en
data_files:
- split: train
path: fi-en/train-*
- split: validation
path: fi-en/validation-*
- split: test
path: fi-en/test-*
- config_name: ro-en
data_files:
- split: train
path: ro-en/train-*
- split: validation
path: ro-en/validation-*
- split: test
path: ro-en/test-*
- config_name: ru-en
data_files:
- split: train
path: ru-en/train-*
- split: validation
path: ru-en/validation-*
- split: test
path: ru-en/test-*
- config_name: tr-en
data_files:
- split: train
path: tr-en/train-*
- split: validation
path: tr-en/validation-*
- split: test
path: tr-en/test-*
---
# Dataset Card for "wmt16"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [http://www.statmt.org/wmt16/translation-task.html](http://www.statmt.org/wmt16/translation-task.html)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 1.69 GB
- **Size of the generated dataset:** 297.28 MB
- **Total amount of disk used:** 1.99 GB
### Dataset Summary
<div class="course-tip course-tip-orange bg-gradient-to-br dark:bg-gradient-to-r before:border-orange-500 dark:before:border-orange-800 from-orange-50 dark:from-gray-900 to-white dark:to-gray-950 border border-orange-50 text-orange-700 dark:text-gray-400">
<p><b>Warning:</b> There are issues with the Common Crawl corpus data (<a href="https://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz">training-parallel-commoncrawl.tgz</a>):</p>
<ul>
<li>Non-English files contain many English sentences.</li>
<li>Their "parallel" English sentences are not aligned: they are uncorrelated with their counterparts.</li>
</ul>
<p>We have contacted the WMT organizers, and in response, they have indicated that they do not have plans to update the Common Crawl corpus data. Their rationale pertains to the expectation that such data has been superseded, primarily by CCMatrix, and to some extent, by ParaCrawl datasets.</p>
</div>
Translation dataset based on the data from statmt.org.
Versions exist for different years using a combination of data
sources. The base `wmt` builder allows you to create a custom dataset by choosing
your own data/language pair. This can be done as follows:
```python
import datasets
from datasets import inspect_dataset, load_dataset_builder

inspect_dataset("wmt16", "path/to/scripts")
builder = load_dataset_builder(
    "path/to/scripts/wmt_utils.py",
    language_pair=("fr", "de"),
    subsets={
        datasets.Split.TRAIN: ["commoncrawl_frde"],
        datasets.Split.VALIDATION: ["euelections_dev2019"],
    },
)
# Standard version
builder.download_and_prepare()
ds = builder.as_dataset()
# Streamable version
ds = builder.as_streaming_dataset()
```
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### cs-en
- **Size of downloaded dataset files:** 1.69 GB
- **Size of the generated dataset:** 297.28 MB
- **Total amount of disk used:** 1.99 GB
An example of 'validation' looks as follows.
```
```
### Data Fields
The data fields are the same among all splits.
#### cs-en
- `translation`: a multilingual `string` variable, with possible languages including `cs`, `en`.
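Each record is a single `translation` dict keyed by language code, so flattening records into sentence pairs is straightforward. A minimal sketch using a hand-made stand-in record (the sentences are illustrative placeholders, not corpus data):

```python
# Shape of a wmt16 record: {"translation": {<lang1>: str, <lang2>: str}}
def to_pairs(records, src="cs", tgt="en"):
    """Flatten wmt16-style records into (source, target) sentence tuples."""
    return [(r["translation"][src], r["translation"][tgt]) for r in records]

# Illustrative placeholder record (not an actual corpus sentence).
sample = [{"translation": {"cs": "Ahoj svete.", "en": "Hello world."}}]
print(to_pairs(sample))  # [('Ahoj svete.', 'Hello world.')]
```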
### Data Splits
|name |train |validation|test|
|-----|-----:|---------:|---:|
|cs-en|997240| 2656|2999|
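The example counts above can be cross-checked against the byte counts declared in the YAML metadata at the top of this card; for `cs-en`, the three split sizes sum exactly to the declared `dataset_size`:

```python
# Split byte counts for the cs-en config, copied from the YAML metadata above.
cs_en_num_bytes = {"train": 295995226, "validation": 572195, "test": 707862}

total = sum(cs_en_num_bytes.values())
print(total)  # 297275283, matching the declared cs-en dataset_size
```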
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{bojar-EtAl:2016:WMT1,
  author = {Bojar, Ond{\v{r}}ej and Chatterjee, Rajen and Federmann, Christian and Graham, Yvette and Haddow, Barry and Huck, Matthias and Jimeno Yepes, Antonio and Koehn, Philipp and Logacheva, Varvara and Monz, Christof and Negri, Matteo and Neveol, Aurelie and Neves, Mariana and Popel, Martin and Post, Matt and Rubino, Raphael and Scarton, Carolina and Specia, Lucia and Turchi, Marco and Verspoor, Karin and Zampieri, Marcos},
title = {Findings of the 2016 Conference on Machine Translation},
booktitle = {Proceedings of the First Conference on Machine Translation},
month = {August},
year = {2016},
address = {Berlin, Germany},
publisher = {Association for Computational Linguistics},
pages = {131--198},
url = {http://www.aclweb.org/anthology/W/W16/W16-2301}
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@patrickvonplaten](https://github.com/patrickvonplaten) for adding this dataset. |
TinyPixel/lima_2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2960580
num_examples: 1030
download_size: 1703604
dataset_size: 2960580
---
# Dataset Card for "lima_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kbharat7/DogChestXrayDatasetNew | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 444263783.625
num_examples: 1395
download_size: 444252328
dataset_size: 444263783.625
---
# Dataset Card for "DogChestXrayDatasetNew"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-squad_v2-squad_v2-76c05b-14906070 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: deepset/roberta-base-squad2-covid
metrics: ['bertscore']
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: deepset/roberta-base-squad2-covid
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nonchalant-nagavalli](https://huggingface.co/nonchalant-nagavalli) for evaluating this model. |
UsmanMalikLearn/Pak-Legal-Llama2 | ---
dataset_info:
features:
- name: transformed_text
dtype: string
splits:
- name: train
num_bytes: 857655
num_examples: 2229
download_size: 369985
dataset_size: 857655
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
youlive789/sales | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edited_image
dtype: image
- name: edit_promt
dtype: string
splits:
- name: train
num_bytes: 111237824.0
num_examples: 140
download_size: 111236059
dataset_size: 111237824.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Techatach/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/squad_qa_context_v5_full_recite_full_passage_random_permute_rerun_8 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 6222942.0
num_examples: 2385
- name: validation
num_bytes: 808532
num_examples: 300
download_size: 1374285
dataset_size: 7031474.0
---
# Dataset Card for "squad_qa_context_v5_full_recite_full_passage_random_permute_rerun_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Xokito/datasethevelyn | ---
license: openrail
---
|
CyberHarem/le_malin_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of le_malin/ル・マラン/恶毒 (Azur Lane)
This is the dataset of le_malin/ル・マラン/恶毒 (Azur Lane), containing 500 images and their tags.
The core tags of this character are `long_hair, blue_eyes, white_hair, very_long_hair, bangs, breasts, small_breasts, animal_ears, rabbit_ears, braid, twin_braids, fake_animal_ears, hair_bun, double_bun, animal_ear_fluff`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 933.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/le_malin_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 419.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/le_malin_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1378 | 988.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/le_malin_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 767.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/le_malin_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1378 | 1.52 GiB | [Download](https://huggingface.co/datasets/CyberHarem/le_malin_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/le_malin_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
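Each item exposes its tags via `item.meta['tags']`, so a specific outfit cluster can be selected with a simple tag predicate. A minimal sketch using plain dicts as stand-ins for waifuc items (the tag names are taken from the cluster tables below):

```python
def has_tags(item_tags, required):
    """True if every required tag appears in the item's tag set."""
    return set(required).issubset(set(item_tags))

# Plain-dict stand-ins for waifuc items; real items carry tags in item.meta['tags'].
items = [
    {"tags": ["1girl", "playboy_bunny", "white_leotard", "solo"]},
    {"tags": ["1girl", "white_dress", "white_rose", "solo"]},
]
bunny_items = [it for it in items if has_tags(it["tags"], ["playboy_bunny"])]
print(len(bunny_items))  # 1
```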
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | +_+, 1girl, choker, criss-cross_halter, solo, white_dress, white_rose, bare_shoulders, blush, hair_between_eyes, looking_at_viewer, sleeveless_dress, collarbone, sidelocks, bare_arms, parted_lips, barefoot, grey_hair, hand_up, on_back |
| 1 | 17 |  |  |  |  |  | +_+, 1girl, solo, white_pantyhose, sidelocks, white_dress, hair_between_eyes, looking_at_viewer, long_sleeves, symbol-shaped_pupils, blush, navel, detached_sleeves, parted_lips, sitting, tutu |
| 2 | 13 |  |  |  |  |  | +_+, 1girl, solo, white_dress, white_pantyhose, hair_between_eyes, long_sleeves, sidelocks, holding_sword, looking_at_viewer, thighband_pantyhose, closed_mouth, rapier, simple_background, underboob_cutout, blush, white_background, cape, hair_bow, full_body, pocket_watch, ribbon |
| 3 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, navel, nipples, nude, simple_background, solo, +_+, hair_between_eyes, blush, collarbone, hair_ribbon, loli, open_mouth, white_background, anus, clitoris, closed_mouth, spread_pussy, standing, symbol-shaped_pupils, uncensored |
| 4 | 18 |  |  |  |  |  | +_+, 1girl, bare_shoulders, solo, looking_at_viewer, white_bikini, symbol-shaped_pupils, sidelocks, barefoot, blush, bare_legs, headphones_around_neck, hammock, arm_garter, on_back, hair_between_eyes, navel, ass, feet, thighs, purple_flower, stomach |
| 5 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, official_alternate_costume, playboy_bunny, solo, white_leotard, white_pantyhose, wrist_cuffs, blush, simple_background, white_background, choker, sitting, +_+, closed_mouth, thigh_strap, twintails, hairband, rabbit_tail |
| 6 | 10 |  |  |  |  |  | 1girl, ass, looking_at_viewer, official_alternate_costume, playboy_bunny, rabbit_tail, solo, white_leotard, white_pantyhose, from_behind, hairband, looking_back, +_+, simple_background, thigh_strap, white_background, wrist_cuffs, blush, parted_lips, sideboob, thighs |
| 7 | 7 |  |  |  |  |  | 1girl, covered_navel, looking_at_viewer, official_alternate_costume, playboy_bunny, simple_background, solo, white_leotard, white_pantyhose, blush, thigh_strap, white_background, wrist_cuffs, choker, closed_mouth, highleg_leotard, +_+, thighs, twintails |
| 8 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, official_alternate_costume, playboy_bunny, rabbit_tail, solo, white_leotard, white_pantyhose, +_+, ass, from_behind, looking_back, no_shoes, wrist_cuffs, couch, blush, full_body, low_twin_braids, sideboob, thigh_strap, wariza, white_hairband |
| 9 | 16 |  |  |  |  |  | 1girl, hetero, solo_focus, penis, sex, 1boy, blush, censored, nipples, +_+, pussy, vaginal, open_mouth, pantyhose, official_alternate_costume, straddling, loli, looking_at_viewer, playboy_bunny, simple_background, spread_legs, clothed_female_nude_male, girl_on_top, white_leotard, crotchless, dark-skinned_male, interracial, navel, wrist_cuffs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | +_+ | 1girl | choker | criss-cross_halter | solo | white_dress | white_rose | bare_shoulders | blush | hair_between_eyes | looking_at_viewer | sleeveless_dress | collarbone | sidelocks | bare_arms | parted_lips | barefoot | grey_hair | hand_up | on_back | white_pantyhose | long_sleeves | symbol-shaped_pupils | navel | detached_sleeves | sitting | tutu | holding_sword | thighband_pantyhose | closed_mouth | rapier | simple_background | underboob_cutout | white_background | cape | hair_bow | full_body | pocket_watch | ribbon | nipples | nude | hair_ribbon | loli | open_mouth | anus | clitoris | spread_pussy | standing | uncensored | white_bikini | bare_legs | headphones_around_neck | hammock | arm_garter | ass | feet | thighs | purple_flower | stomach | official_alternate_costume | playboy_bunny | white_leotard | wrist_cuffs | thigh_strap | twintails | hairband | rabbit_tail | from_behind | looking_back | sideboob | covered_navel | highleg_leotard | no_shoes | couch | low_twin_braids | wariza | white_hairband | hetero | solo_focus | penis | sex | 1boy | censored | pussy | vaginal | pantyhose | straddling | spread_legs | clothed_female_nude_male | girl_on_top | crotchless | dark-skinned_male | interracial |
(Preview table of sample images and tag coverage per cluster omitted; the image previews and tag-column headers did not survive extraction.)
|
davisonye/cg_pad_mai_dav_1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': maize
'1': paddy_rice
'2': unknown
splits:
- name: train
num_bytes: 5026074267.1
num_examples: 1683
- name: validation
num_bytes: 308905569.0
num_examples: 89
- name: test
num_bytes: 330351357.0
num_examples: 96
download_size: 6239113776
dataset_size: 5665331193.1
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-eval-futin__guess-en-6f8c6a-2012266598 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/guess
eval_info:
task: text_zero_shot_classification
model: facebook/opt-2.7b
metrics: []
dataset_name: futin/guess
dataset_config: en
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-2.7b
* Dataset: futin/guess
* Config: en
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
ctang/1k_deontology_train_llama2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 679786
num_examples: 1000
download_size: 84465
dataset_size: 679786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HuggingFaceH4/ifeval | ---
license: apache-2.0
---
# Dataset Card for IFEval
<!-- Provide a quick summary of the dataset. -->
This dataset contains the prompts used in Google's [Instruction-Following Evaluation for Large Language Models](https://arxiv.org/abs/2311.07911). You can find the original source data and evaluation code [here](https://github.com/google-research/google-research/tree/master/instruction_following_eval).
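IFEval scores responses against *verifiable* instructions embedded in each prompt (e.g. "mention the word X", "answer in under N words"). As a toy illustration only — not the official checker, which lives in the linked Google Research repo — such verifiers can be sketched as plain predicate functions:

```python
# Illustrative sketch of IFEval-style verifiable instructions.
# These are toy checkers, not the official evaluation code.

def follows_keyword_instruction(response: str, required_keywords: list[str]) -> bool:
    """True if every required keyword appears in the response (case-insensitive)."""
    lowered = response.lower()
    return all(kw.lower() in lowered for kw in required_keywords)

def follows_length_instruction(response: str, max_words: int) -> bool:
    """True if the response stays within a word budget."""
    return len(response.split()) <= max_words
```

The appeal of this style of evaluation is that compliance can be checked programmatically, without a human or model judge.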
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_porkorbeef__Llama-2-13b-sf | ---
pretty_name: Evaluation run of porkorbeef/Llama-2-13b-sf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [porkorbeef/Llama-2-13b-sf](https://huggingface.co/porkorbeef/Llama-2-13b-sf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_porkorbeef__Llama-2-13b-sf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-19T10:54:43.878914](https://huggingface.co/datasets/open-llm-leaderboard/details_porkorbeef__Llama-2-13b-sf/blob/main/results_2023-10-19T10-54-43.878914.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 2.3070469798657717e-05,\n \"f1_stderr\"\
: 1.0056123859027576e-05,\n \"acc\": 0.2517758484609313,\n \"acc_stderr\"\
: 0.0070260655734579345\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n\
\ \"em_stderr\": 0.0,\n \"f1\": 2.3070469798657717e-05,\n \"\
f1_stderr\": 1.0056123859027576e-05\n },\n \"harness|gsm8k|5\": {\n \
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5035516969218626,\n \"acc_stderr\": 0.014052131146915869\n\
\ }\n}\n```"
repo_url: https://huggingface.co/porkorbeef/Llama-2-13b-sf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T10_54_43.878914
path:
- '**/details_harness|drop|3_2023-10-19T10-54-43.878914.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T10-54-43.878914.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T10_54_43.878914
path:
- '**/details_harness|gsm8k|5_2023-10-19T10-54-43.878914.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-19T10-54-43.878914.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T10_54_43.878914
path:
- '**/details_harness|winogrande|5_2023-10-19T10-54-43.878914.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T10-54-43.878914.parquet'
- config_name: results
data_files:
- split: 2023_10_19T10_54_43.878914
path:
- results_2023-10-19T10-54-43.878914.parquet
- split: latest
path:
- results_2023-10-19T10-54-43.878914.parquet
---
# Dataset Card for Evaluation run of porkorbeef/Llama-2-13b-sf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/porkorbeef/Llama-2-13b-sf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [porkorbeef/Llama-2-13b-sf](https://huggingface.co/porkorbeef/Llama-2-13b-sf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_porkorbeef__Llama-2-13b-sf",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-19T10:54:43.878914](https://huggingface.co/datasets/open-llm-leaderboard/details_porkorbeef__Llama-2-13b-sf/blob/main/results_2023-10-19T10-54-43.878914.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 2.3070469798657717e-05,
"f1_stderr": 1.0056123859027576e-05,
"acc": 0.2517758484609313,
"acc_stderr": 0.0070260655734579345
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 2.3070469798657717e-05,
"f1_stderr": 1.0056123859027576e-05
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5035516969218626,
"acc_stderr": 0.014052131146915869
}
}
```
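The nested harness results above can be flattened into a per-task metric table. A minimal sketch, using a subset of the JSON shown (the drop task is omitted here because it reports em/f1 rather than acc):

```python
# Flatten harness results of the shape shown above into {task: accuracy}.
results = {
    "all": {"acc": 0.2517758484609313},
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.5035516969218626},
}

per_task_acc = {
    task.split("|")[1]: metrics["acc"]      # "harness|gsm8k|5" -> "gsm8k"
    for task, metrics in results.items()
    if task.startswith("harness|") and "acc" in metrics
}
# -> {"gsm8k": 0.0, "winogrande": 0.5035516969218626}
```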
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kkcosmos/instagram-images-with-captions | ---
license: unknown
dataset_info:
features:
- name: image
dtype: image
- name: item_id
dtype: int64
- name: image_path
dtype: string
- name: caption
dtype: string
splits:
- name: train
num_bytes: 3516512787.609302
num_examples: 28360
- name: test
num_bytes: 391305063.2386981
num_examples: 3152
download_size: 0
dataset_size: 3907817850.848
---
|
bhjhk/samino11 | ---
license: bigscience-bloom-rail-1.0
---
|
AdapterOcean/med_alpaca_standardized_cluster_68_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 16970130
num_examples: 24531
download_size: 8602932
dataset_size: 16970130
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_68_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ivanfdzm/Arq-Style | ---
license: afl-3.0
---
|
Seanxh/twitter_dataset_1713173437 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 25981
num_examples: 64
download_size: 15681
dataset_size: 25981
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
chirp-watai/audio_dataset | ---
task_categories:
- zero-shot-classification
tags:
- audio
- sound
pretty_name: audio
size_categories:
- 1K<n<10K
---
# Audio Dataset
This dataset consists of audio data for the following categories:
* Coughing
* Running water
* Toilet flush
* Other sounds
Although the data is unbalanced, augmentations can be applied when preparing it for audio classification. The file structure is as follows:

- audio/
  - coughing/
  - toilet_flush/
  - running_water/
  - other_1/
  - other_2/ |
AdapterOcean/med_alpaca_standardized_cluster_10_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 18079246
num_examples: 25458
download_size: 9399571
dataset_size: 18079246
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_10_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Capsekai/Badge_crafts | ---
license: creativeml-openrail-m
task_categories:
- text-to-image
- image-classification
language:
- en
tags:
- badges
- crafts
pretty_name: Badge Craft Dataset
size_categories:
- 1K<n<10K
---
# Do what you will with the data; these are old photos of crafts I used to make. Just abide by the license above and you're good to go! |
dlwh/Multi_Legal_Pile | ---
license: cc-by-4.0
---
|
luckeciano/hermes-features-ultrafeedback | ---
configs:
- config_name: default
data_files:
- split: train
path: "train.csv"
- split: valid
path: "eval.csv"
- split: test
path: "test.csv"
license: apache-2.0
--- |
AIGym/TextbookChapters | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1013875541
num_examples: 77932
download_size: 504483603
dataset_size: 1013875541
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lshowway/wikipedia.reorder.svo.de | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2385745587
num_examples: 1137317
download_size: 1063402086
dataset_size: 2385745587
---
# Dataset Card for "wikipedia.reorder.svo.de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pvduy/exp_dpo_1 | ---
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 120210099
num_examples: 14313
- name: test
num_bytes: 4538037
num_examples: 750
download_size: 64945705
dataset_size: 124748136
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
GVJahnavi/Crops_set | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Corn_(maize)___Cercospora_leaf_spot Gray_leaf_spot
'1': Corn_(maize)___Common_rust_
'2': Corn_(maize)___Northern_Leaf_Blight
'3': Corn_(maize)___healthy
'4': Pepper,_bell___Bacterial_spot
'5': Pepper,_bell___healthy
'6': Potato___Early_blight
'7': Potato___Late_blight
'8': Potato___healthy
'9': Soybean___healthy
'10': Tomato___Bacterial_spot
'11': Tomato___Early_blight
'12': Tomato___Late_blight
'13': Tomato___Leaf_Mold
'14': Tomato___Septoria_leaf_spot
'15': Tomato___Spider_mites Two-spotted_spider_mite
'16': Tomato___Target_Spot
'17': Tomato___Tomato_Yellow_Leaf_Curl_Virus
'18': Tomato___Tomato_mosaic_virus
'19': Tomato___healthy
splits:
- name: train
num_bytes: 381142501.808
num_examples: 25384
- name: test
num_bytes: 93073910.262
num_examples: 6346
download_size: 514893426
dataset_size: 474216412.07000005
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
benayas/sst2_llm_v0 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': positive
- name: idx
dtype: int32
- name: eval_text
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 67924009
num_examples: 67349
- name: validation
num_bytes: 1016356
num_examples: 872
- name: test
num_bytes: 2104899
num_examples: 1821
download_size: 13965924
dataset_size: 71045264
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
seidouz/Diabetes | ---
license: openrail
---
Simulated dataset for the project https://github.com/bitDalei/Diabetes-Classification-with-Heterogeneous-Data
**Explanation**
- column 0: label
- column 1-576: FGM data
- column 576-587: Biomarkers data
You might notice that some consecutive rows have the same biomarkers; this means those rows were contributed by the same patient. There are also some missing values in the biomarkers, represented as '0'.
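Given the column ranges above, a row can be split into its three blocks roughly as follows. This is a sketch: the exact boundary index is an assumption, since the card's stated ranges overlap at column 576 (here the FGM block is taken as columns 1-576 and the biomarkers as the remaining columns).

```python
# Sketch: split one 588-column row into label / FGM / biomarker blocks.
# Boundary indices are assumptions based on the column ranges in this card.

def split_row(row):
    label = row[0]           # column 0: label
    fgm = row[1:577]         # columns 1-576: FGM data
    biomarkers = row[577:]   # remaining columns: biomarker data
    return label, fgm, biomarkers

row = list(range(588))  # dummy row with 588 columns (0..587)
label, fgm, biomarkers = split_row(row)
```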
|
Aashi/Science_Q_and_A_dataset | ---
license: mit
---
This dataset has been extracted from the Class 9 CBSE science textbook for fine-tuning an LLM for a chatbot application. |
notjoemartinez/Mozart | ---
license: apache-2.0
---
|
prathap-reddy/autotrain-data-climate-text-classification | ---
language:
- en
task_categories:
- text-classification
---
# AutoTrain Dataset for project: climate-text-classification
## Dataset Description
This dataset has been automatically processed by AutoTrain for the project climate-text-classification.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "So the way our model has currently been built, we would look to pay down debt with our free cash flow generation that we're planning on generating this year, which is around $20 million to $30 million.",
"target": 0
},
{
"text": "So we don't see any big drama on the long-term FMPs as a result of this.",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(num_classes=2, names=['0', '1'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follow:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 1919 |
| valid | 481 |
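As a quick sanity check, the split sizes above correspond to roughly an 80/20 train/validation split:

```python
# Compute split proportions from the sizes listed above.
splits = {"train": 1919, "valid": 481}
total = sum(splits.values())  # 2400
proportions = {name: round(n / total, 2) for name, n in splits.items()}
# train ~ 0.80, valid ~ 0.20
```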
|
ofir408/try2 | ---
license: apache-2.0
---
|
mfidabel/common_voice_16_1_semisupervised | ---
dataset_info:
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
- name: predicted_sentence
dtype: string
splits:
- name: whisper.medium
num_bytes: 8271758359.625
num_examples: 18779
- name: whisper.small
num_bytes: 2763174559.625
num_examples: 18779
- name: whisper.large.v3
num_bytes: 8271776000.625
num_examples: 18779
- name: whisper.tiny
num_bytes: 8271831240.625
num_examples: 18779
download_size: 23466393916
dataset_size: 27578540160.5
configs:
- config_name: default
data_files:
- split: whisper.medium
path: data/whisper.medium-*
- split: whisper.small
path: data/whisper.small-*
- split: whisper.large.v3
path: data/whisper.large.v3-*
- split: whisper.tiny
path: data/whisper.tiny-*
---
|
PS-123/LungNodule | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 132655580.844
num_examples: 1723
download_size: 157049632
dataset_size: 132655580.844
---
# Dataset Card for "LungNodule"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
quibblerquery/youTubeData | ---
license: mit
---
|
open-llm-leaderboard/details_TFLai__gpt-neo-1.3B-4bit-alpaca | ---
pretty_name: Evaluation run of TFLai/gpt-neo-1.3B-4bit-alpaca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/gpt-neo-1.3B-4bit-alpaca](https://huggingface.co/TFLai/gpt-neo-1.3B-4bit-alpaca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__gpt-neo-1.3B-4bit-alpaca\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T17:57:15.784929](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__gpt-neo-1.3B-4bit-alpaca/blob/main/results_2023-09-23T17-57-15.784929.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n\
\ \"em_stderr\": 0.00031446531194133983,\n \"f1\": 0.05118708053691287,\n\
\ \"f1_stderr\": 0.001257884278930374,\n \"acc\": 0.2821159149890526,\n\
\ \"acc_stderr\": 0.007628169555669113\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.00031446531194133983,\n\
\ \"f1\": 0.05118708053691287,\n \"f1_stderr\": 0.001257884278930374\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148674233\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5619573796369376,\n \"acc_stderr\": 0.013944181296470803\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TFLai/gpt-neo-1.3B-4bit-alpaca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|arc:challenge|25_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T17_57_15.784929
path:
- '**/details_harness|drop|3_2023-09-23T17-57-15.784929.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T17-57-15.784929.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T17_57_15.784929
path:
- '**/details_harness|gsm8k|5_2023-09-23T17-57-15.784929.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T17-57-15.784929.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hellaswag|10_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T17_57_15.784929
path:
- '**/details_harness|winogrande|5_2023-09-23T17-57-15.784929.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T17-57-15.784929.parquet'
- config_name: results
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- results_2023-08-18T13:07:16.687815.parquet
- split: 2023_09_23T17_57_15.784929
path:
- results_2023-09-23T17-57-15.784929.parquet
- split: latest
path:
- results_2023-09-23T17-57-15.784929.parquet
---
# Dataset Card for Evaluation run of TFLai/gpt-neo-1.3B-4bit-alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/gpt-neo-1.3B-4bit-alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/gpt-neo-1.3B-4bit-alpaca](https://huggingface.co/TFLai/gpt-neo-1.3B-4bit-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__gpt-neo-1.3B-4bit-alpaca",
"harness_winogrande_5",
	split="latest")
```
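The split names are derived from the run timestamps, with the `-` and `:` separators replaced so they are valid split identifiers. A small helper sketch of the mapping (inferred from the names in this card's config, not an official API):

```python
def timestamp_to_split(ts: str) -> str:
    # Split names replace the '-' and ':' separators with '_',
    # keeping the fractional seconds intact.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-09-23T17:57:15.784929"))
# → 2023_09_23T17_57_15.784929
```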
## Latest results
These are the [latest results from run 2023-09-23T17:57:15.784929](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__gpt-neo-1.3B-4bit-alpaca/blob/main/results_2023-09-23T17-57-15.784929.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"em": 0.0009437919463087249,
"em_stderr": 0.00031446531194133983,
"f1": 0.05118708053691287,
"f1_stderr": 0.001257884278930374,
"acc": 0.2821159149890526,
"acc_stderr": 0.007628169555669113
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.00031446531194133983,
"f1": 0.05118708053691287,
"f1_stderr": 0.001257884278930374
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674233
},
"harness|winogrande|5": {
"acc": 0.5619573796369376,
"acc_stderr": 0.013944181296470803
}
}
```
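The aggregated metrics under the `"all"` key can be pulled out of the results JSON directly. A sketch using an inlined fragment of the numbers above (the leaderboard's own aggregation and display logic may differ):

```python
import json

# Inlined fragment of the "all" block from the results JSON above.
results = json.loads("""
{
  "all": {
    "em": 0.0009437919463087249,
    "f1": 0.05118708053691287,
    "acc": 0.2821159149890526
  }
}
""")

# Convert to percentages rounded to two decimals for a summary view.
summary = {k: round(v * 100, 2) for k, v in results["all"].items()}
print(summary)
# → {'em': 0.09, 'f1': 5.12, 'acc': 28.21}
```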
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
devjirawat8/Air_Vehicle | ---
license: mit
---
|
CyberHarem/glasgow_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of glasgow/グラスゴー/格拉斯哥 (Azur Lane)
This is the dataset of glasgow/グラスゴー/格拉斯哥 (Azur Lane), containing 68 images and their tags.
The core tags of this character are `long_hair, twintails, bangs, breasts, black_hair, maid_headdress, ribbon, blue_eyes, bow, low_twintails, large_breasts, brown_hair, green_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 68 | 76.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/glasgow_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 68 | 44.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/glasgow_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 167 | 96.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/glasgow_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 68 | 68.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/glasgow_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 167 | 132.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/glasgow_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need this, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/glasgow_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 46 |  |  |  |  |  | 1girl, maid, solo, blush, looking_at_viewer, frills, simple_background, wrist_cuffs, white_background, black_dress, puffy_short_sleeves, choker, cleavage, white_apron |
| 1 | 18 |  |  |  |  |  | 1girl, blush, looking_at_viewer, white_shirt, school_uniform, plaid_skirt, red_necktie, solo, long_sleeves, pleated_skirt, collared_shirt, black_jacket, hairclip, red_skirt, blazer, hair_between_eyes, open_jacket, hair_ribbon, open_mouth, very_long_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | maid | solo | blush | looking_at_viewer | frills | simple_background | wrist_cuffs | white_background | black_dress | puffy_short_sleeves | choker | cleavage | white_apron | white_shirt | school_uniform | plaid_skirt | red_necktie | long_sleeves | pleated_skirt | collared_shirt | black_jacket | hairclip | red_skirt | blazer | hair_between_eyes | open_jacket | hair_ribbon | open_mouth | very_long_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------|:--------|:--------------------|:---------|:--------------------|:--------------|:-------------------|:--------------|:----------------------|:---------|:-----------|:--------------|:--------------|:-----------------|:--------------|:--------------|:---------------|:----------------|:-----------------|:---------------|:-----------|:------------|:---------|:--------------------|:--------------|:--------------|:-------------|:-----------------|
| 0 | 46 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 18 |  |  |  |  |  | X | | X | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
NathanRoll/TalkBank_CA_CABNC_1 | ---
dataset_info:
features:
- name: audio
sequence: float32
- name: __index_level_0__
dtype: string
splits:
- name: train
num_bytes: 17080612876
num_examples: 100
download_size: 17100223923
dataset_size: 17080612876
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "TalkBank_CA_CABNC_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
datasciathlete/open-ner-english | ---
dataset_info:
features:
- name: entities
list:
- name: entity_mentions
sequence: string
- name: entity_type
dtype: string
- name: spans
sequence:
sequence: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 72484875.50541525
num_examples: 36711
- name: validation
num_bytes: 18121712.494584758
num_examples: 9178
download_size: 90106463
dataset_size: 90606588.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
cimadure/snow_removal_transactions_in_montreal | ---
license: cc-by-4.0
---
|
open-llm-leaderboard/details_stanford-oval__Llama-2-7b-WikiChat-fused | ---
pretty_name: Evaluation run of stanford-oval/Llama-2-7b-WikiChat-fused
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [stanford-oval/Llama-2-7b-WikiChat-fused](https://huggingface.co/stanford-oval/Llama-2-7b-WikiChat-fused)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_stanford-oval__Llama-2-7b-WikiChat-fused\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T05:29:49.524111](https://huggingface.co/datasets/open-llm-leaderboard/details_stanford-oval__Llama-2-7b-WikiChat-fused/blob/main/results_2024-01-16T05-29-49.524111.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3993615462452277,\n\
\ \"acc_stderr\": 0.03426948672248777,\n \"acc_norm\": 0.4047541011703342,\n\
\ \"acc_norm_stderr\": 0.035187845841979204,\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.01618574435514492,\n \"mc2\": 0.46362057203759177,\n\
\ \"mc2_stderr\": 0.015792473170958256\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47696245733788395,\n \"acc_stderr\": 0.014595873205358262,\n\
\ \"acc_norm\": 0.5068259385665529,\n \"acc_norm_stderr\": 0.014610029151379813\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5690101573391755,\n\
\ \"acc_stderr\": 0.00494202620027959,\n \"acc_norm\": 0.7499502091216889,\n\
\ \"acc_norm_stderr\": 0.004321564303822422\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.32894736842105265,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.32894736842105265,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4188679245283019,\n \"acc_stderr\": 0.030365050829115208,\n\
\ \"acc_norm\": 0.4188679245283019,\n \"acc_norm_stderr\": 0.030365050829115208\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3179190751445087,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.3179190751445087,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37446808510638296,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.37446808510638296,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633356,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633356\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4290322580645161,\n\
\ \"acc_stderr\": 0.028156036538233217,\n \"acc_norm\": 0.4290322580645161,\n\
\ \"acc_norm_stderr\": 0.028156036538233217\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617715,\n\
\ \"acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617715\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.41818181818181815,\n \"acc_stderr\": 0.03851716319398393,\n\
\ \"acc_norm\": 0.41818181818181815,\n \"acc_norm_stderr\": 0.03851716319398393\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4898989898989899,\n \"acc_stderr\": 0.035616254886737454,\n \"\
acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.035616254886737454\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5492227979274611,\n \"acc_stderr\": 0.03590910952235524,\n\
\ \"acc_norm\": 0.5492227979274611,\n \"acc_norm_stderr\": 0.03590910952235524\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4153846153846154,\n \"acc_stderr\": 0.02498535492310234,\n \
\ \"acc_norm\": 0.4153846153846154,\n \"acc_norm_stderr\": 0.02498535492310234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23333333333333334,\n \"acc_stderr\": 0.025787874220959316,\n \
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.025787874220959316\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969654,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969654\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5247706422018349,\n\
\ \"acc_stderr\": 0.021410999753635914,\n \"acc_norm\": 0.5247706422018349,\n\
\ \"acc_norm_stderr\": 0.021410999753635914\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.033247089118091176,\n\
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.033247089118091176\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.45098039215686275,\n \"acc_stderr\": 0.03492406104163613,\n \"\
acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.03492406104163613\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4388185654008439,\n \"acc_stderr\": 0.032302649315470375,\n \
\ \"acc_norm\": 0.4388185654008439,\n \"acc_norm_stderr\": 0.032302649315470375\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4618834080717489,\n\
\ \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.4618834080717489,\n\
\ \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.04328577215262971,\n\
\ \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.04328577215262971\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4462809917355372,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\"\
: 0.4462809917355372,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4351851851851852,\n\
\ \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.4351851851851852,\n\
\ \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4233128834355828,\n \"acc_stderr\": 0.038818912133343826,\n\
\ \"acc_norm\": 0.4233128834355828,\n \"acc_norm_stderr\": 0.038818912133343826\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5145631067961165,\n \"acc_stderr\": 0.049486373240266356,\n\
\ \"acc_norm\": 0.5145631067961165,\n \"acc_norm_stderr\": 0.049486373240266356\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5811965811965812,\n\
\ \"acc_stderr\": 0.03232128912157792,\n \"acc_norm\": 0.5811965811965812,\n\
\ \"acc_norm_stderr\": 0.03232128912157792\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5759897828863346,\n\
\ \"acc_stderr\": 0.017672263329084222,\n \"acc_norm\": 0.5759897828863346,\n\
\ \"acc_norm_stderr\": 0.017672263329084222\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3699421965317919,\n \"acc_stderr\": 0.025992472029306376,\n\
\ \"acc_norm\": 0.3699421965317919,\n \"acc_norm_stderr\": 0.025992472029306376\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28044692737430166,\n\
\ \"acc_stderr\": 0.01502408388332289,\n \"acc_norm\": 0.28044692737430166,\n\
\ \"acc_norm_stderr\": 0.01502408388332289\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4150326797385621,\n \"acc_stderr\": 0.028213504177824103,\n\
\ \"acc_norm\": 0.4150326797385621,\n \"acc_norm_stderr\": 0.028213504177824103\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.44694533762057875,\n\
\ \"acc_stderr\": 0.028237769422085335,\n \"acc_norm\": 0.44694533762057875,\n\
\ \"acc_norm_stderr\": 0.028237769422085335\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3487654320987654,\n \"acc_stderr\": 0.02651759772446501,\n\
\ \"acc_norm\": 0.3487654320987654,\n \"acc_norm_stderr\": 0.02651759772446501\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503786,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503786\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3200782268578879,\n\
\ \"acc_stderr\": 0.011914791947638519,\n \"acc_norm\": 0.3200782268578879,\n\
\ \"acc_norm_stderr\": 0.011914791947638519\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.030161911930767102,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.030161911930767102\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.34967320261437906,\n \"acc_stderr\": 0.019291961895066365,\n \
\ \"acc_norm\": 0.34967320261437906,\n \"acc_norm_stderr\": 0.019291961895066365\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.45454545454545453,\n\
\ \"acc_stderr\": 0.04769300568972744,\n \"acc_norm\": 0.45454545454545453,\n\
\ \"acc_norm_stderr\": 0.04769300568972744\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3469387755102041,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.3469387755102041,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5422885572139303,\n\
\ \"acc_stderr\": 0.035228658640995975,\n \"acc_norm\": 0.5422885572139303,\n\
\ \"acc_norm_stderr\": 0.035228658640995975\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3493975903614458,\n\
\ \"acc_stderr\": 0.0371172519074075,\n \"acc_norm\": 0.3493975903614458,\n\
\ \"acc_norm_stderr\": 0.0371172519074075\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5906432748538012,\n \"acc_stderr\": 0.03771283107626544,\n\
\ \"acc_norm\": 0.5906432748538012,\n \"acc_norm_stderr\": 0.03771283107626544\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.01618574435514492,\n \"mc2\": 0.46362057203759177,\n\
\ \"mc2_stderr\": 0.015792473170958256\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6906077348066298,\n \"acc_stderr\": 0.012991329330822999\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225212\n }\n}\n```"
repo_url: https://huggingface.co/stanford-oval/Llama-2-7b-WikiChat-fused
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|arc:challenge|25_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|gsm8k|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hellaswag|10_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T05-29-49.524111.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T05-29-49.524111.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- '**/details_harness|winogrande|5_2024-01-16T05-29-49.524111.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T05-29-49.524111.parquet'
- config_name: results
data_files:
- split: 2024_01_16T05_29_49.524111
path:
- results_2024-01-16T05-29-49.524111.parquet
- split: latest
path:
- results_2024-01-16T05-29-49.524111.parquet
---
# Dataset Card for Evaluation run of stanford-oval/Llama-2-7b-WikiChat-fused
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [stanford-oval/Llama-2-7b-WikiChat-fused](https://huggingface.co/stanford-oval/Llama-2-7b-WikiChat-fused) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_stanford-oval__Llama-2-7b-WikiChat-fused",
    "harness_winogrande_5",
    split="latest")
```
## Latest results
These are the [latest results from run 2024-01-16T05:29:49.524111](https://huggingface.co/datasets/open-llm-leaderboard/details_stanford-oval__Llama-2-7b-WikiChat-fused/blob/main/results_2024-01-16T05-29-49.524111.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.3993615462452277,
"acc_stderr": 0.03426948672248777,
"acc_norm": 0.4047541011703342,
"acc_norm_stderr": 0.035187845841979204,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.01618574435514492,
"mc2": 0.46362057203759177,
"mc2_stderr": 0.015792473170958256
},
"harness|arc:challenge|25": {
"acc": 0.47696245733788395,
"acc_stderr": 0.014595873205358262,
"acc_norm": 0.5068259385665529,
"acc_norm_stderr": 0.014610029151379813
},
"harness|hellaswag|10": {
"acc": 0.5690101573391755,
"acc_stderr": 0.00494202620027959,
"acc_norm": 0.7499502091216889,
"acc_norm_stderr": 0.004321564303822422
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.32894736842105265,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.32894736842105265,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4188679245283019,
"acc_stderr": 0.030365050829115208,
"acc_norm": 0.4188679245283019,
"acc_norm_stderr": 0.030365050829115208
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3179190751445087,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.3179190751445087,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37446808510638296,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.37446808510638296,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.38620689655172413,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.38620689655172413,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633356,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633356
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4290322580645161,
"acc_stderr": 0.028156036538233217,
"acc_norm": 0.4290322580645161,
"acc_norm_stderr": 0.028156036538233217
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.030315099285617715,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.030315099285617715
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.03851716319398393,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.03851716319398393
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.035616254886737454,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.035616254886737454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5492227979274611,
"acc_stderr": 0.03590910952235524,
"acc_norm": 0.5492227979274611,
"acc_norm_stderr": 0.03590910952235524
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4153846153846154,
"acc_stderr": 0.02498535492310234,
"acc_norm": 0.4153846153846154,
"acc_norm_stderr": 0.02498535492310234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.025787874220959316,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.025787874220959316
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42436974789915966,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.42436974789915966,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969654,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5247706422018349,
"acc_stderr": 0.021410999753635914,
"acc_norm": 0.5247706422018349,
"acc_norm_stderr": 0.021410999753635914
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.03492406104163613,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.03492406104163613
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4388185654008439,
"acc_stderr": 0.032302649315470375,
"acc_norm": 0.4388185654008439,
"acc_norm_stderr": 0.032302649315470375
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4618834080717489,
"acc_stderr": 0.03346015011973228,
"acc_norm": 0.4618834080717489,
"acc_norm_stderr": 0.03346015011973228
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4198473282442748,
"acc_stderr": 0.04328577215262971,
"acc_norm": 0.4198473282442748,
"acc_norm_stderr": 0.04328577215262971
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4462809917355372,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.4462809917355372,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4233128834355828,
"acc_stderr": 0.038818912133343826,
"acc_norm": 0.4233128834355828,
"acc_norm_stderr": 0.038818912133343826
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5145631067961165,
"acc_stderr": 0.049486373240266356,
"acc_norm": 0.5145631067961165,
"acc_norm_stderr": 0.049486373240266356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5811965811965812,
"acc_stderr": 0.03232128912157792,
"acc_norm": 0.5811965811965812,
"acc_norm_stderr": 0.03232128912157792
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5759897828863346,
"acc_stderr": 0.017672263329084222,
"acc_norm": 0.5759897828863346,
"acc_norm_stderr": 0.017672263329084222
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3699421965317919,
"acc_stderr": 0.025992472029306376,
"acc_norm": 0.3699421965317919,
"acc_norm_stderr": 0.025992472029306376
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28044692737430166,
"acc_stderr": 0.01502408388332289,
"acc_norm": 0.28044692737430166,
"acc_norm_stderr": 0.01502408388332289
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4150326797385621,
"acc_stderr": 0.028213504177824103,
"acc_norm": 0.4150326797385621,
"acc_norm_stderr": 0.028213504177824103
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.44694533762057875,
"acc_stderr": 0.028237769422085335,
"acc_norm": 0.44694533762057875,
"acc_norm_stderr": 0.028237769422085335
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3487654320987654,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.3487654320987654,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503786,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503786
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3200782268578879,
"acc_stderr": 0.011914791947638519,
"acc_norm": 0.3200782268578879,
"acc_norm_stderr": 0.011914791947638519
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.030161911930767102,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.030161911930767102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.34967320261437906,
"acc_stderr": 0.019291961895066365,
"acc_norm": 0.34967320261437906,
"acc_norm_stderr": 0.019291961895066365
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.04769300568972744,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.04769300568972744
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3469387755102041,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.3469387755102041,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5422885572139303,
"acc_stderr": 0.035228658640995975,
"acc_norm": 0.5422885572139303,
"acc_norm_stderr": 0.035228658640995975
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3493975903614458,
"acc_stderr": 0.0371172519074075,
"acc_norm": 0.3493975903614458,
"acc_norm_stderr": 0.0371172519074075
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5906432748538012,
"acc_stderr": 0.03771283107626544,
"acc_norm": 0.5906432748538012,
"acc_norm_stderr": 0.03771283107626544
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.01618574435514492,
"mc2": 0.46362057203759177,
"mc2_stderr": 0.015792473170958256
},
"harness|winogrande|5": {
"acc": 0.6906077348066298,
"acc_stderr": 0.012991329330822999
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225212
}
}
```
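The per-task entries in these results follow the pattern `harness|hendrycksTest-<sub_task>|<n_shots>`, so they are easy to post-process. As a minimal sketch (using an illustrative three-task subset rather than the full file), this ranks the MMLU sub-tasks by `acc_norm`:

```python
import json

# An illustrative three-task subset of the results JSON above (not the full file).
results_json = """
{
  "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.5906432748538012},
  "harness|hendrycksTest-marketing|5": {"acc_norm": 0.5811965811965812},
  "harness|hendrycksTest-college_physics|5": {"acc_norm": 0.24509803921568626}
}
"""

results = json.loads(results_json)

# Keep only the MMLU ("hendrycksTest") tasks, indexed by sub-task name.
mmlu = {
    task.split("-", 1)[1].split("|")[0]: scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}

best = max(mmlu, key=mmlu.get)
print(best, round(mmlu[best], 3))  # world_religions 0.591
```

The same pattern works on the full results file once downloaded from the repository.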
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
distilled-one-sec-cv12-each-chunk-uniq/chunk_54 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1332826588.0
num_examples: 259709
download_size: 1366988795
dataset_size: 1332826588.0
---
# Dataset Card for "chunk_54"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
julia-wenkmann/TennisSegmentation | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 1159771633.0
num_examples: 197
download_size: 77225573
dataset_size: 1159771633.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
| Category Name | ID | Color |
|---|---|---|
| `object` | 1 | RGB(0, 113, 188) |
| `ball` | 2 | RGB(216, 82, 24) |
| `playerTop` | 3 | RGB(236, 176, 31) |
| `playerBottom` | 4 | RGB(125, 46, 141) |
| `court` | 5 | RGB(118, 171, 47) |
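As an illustrative sketch (not part of the dataset itself), the category IDs and RGB colors above can be gathered into a palette for colorizing integer label masks; the names `PALETTE` and `colorize_mask` are assumptions introduced here:

```python
# Palette built from the category list above: label ID -> RGB color.
# PALETTE and colorize_mask are illustrative names, not part of the dataset.
PALETTE = {
    1: (0, 113, 188),    # object
    2: (216, 82, 24),    # ball
    3: (236, 176, 31),   # playerTop
    4: (125, 46, 141),   # playerBottom
    5: (118, 171, 47),   # court
}

def colorize_mask(mask):
    """Map a 2-D list of integer label IDs to a 2-D list of RGB tuples.

    Unknown IDs (e.g. background 0) fall back to black.
    """
    return [[PALETTE.get(label, (0, 0, 0)) for label in row] for row in mask]
```

For example, `colorize_mask([[5, 2]])` yields one row whose pixels carry the `court` and `ball` colors.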
|
Birchlabs/openai-prm800k-phase2_test-stepwise-critique | ---
license: mit
---
|
n3rd0/VSI_biology_QA | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1341870
num_examples: 4500
- name: test
num_bytes: 242060
num_examples: 794
download_size: 518043
dataset_size: 1583930
---
# Dataset Card for "VSI_biology_QA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AshanGimhana/testing001 | ---
license: mit
---
|
germank/hh-generated_flan_t5_large_with_features2_flan_t5_large | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: 'biased:'
dtype: int64
- name: easy-to-understand
dtype: int64
- name: enough-detail
dtype: int64
- name: factuality
dtype: int64
- name: fail-to-consider-context
dtype: int64
- name: fail-to-consider-individual-preferences
dtype: int64
- name: helpfulness
dtype: int64
- name: intent
dtype: int64
- name: readability
dtype: int64
- name: relevance
dtype: int64
- name: repetetive
dtype: int64
- name: specificity
dtype: int64
- name: too-long
dtype: int64
- name: log_score
dtype: float64
splits:
- name: train
num_bytes: 408123
num_examples: 1600
download_size: 87285
dataset_size: 408123
---
# Dataset Card for "hh-generated_flan_t5_large_with_features2_flan_t5_large"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
reza-alipour/discofuse_1percent | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: connective_string
dtype: string
- name: discourse_type
dtype: string
- name: coherent_second_sentence
dtype: string
- name: has_coref_type_pronoun
dtype: float32
- name: incoherent_first_sentence
dtype: string
- name: incoherent_second_sentence
dtype: string
- name: has_coref_type_nominal
dtype: float32
- name: coherent_first_sentence
dtype: string
splits:
- name: train
num_bytes: 63778517.906742156
num_examples: 163105
download_size: 42876638
dataset_size: 63778517.906742156
---
# Dataset Card for "discofuse_1percent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_39_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 8221814
num_examples: 19165
download_size: 4380540
dataset_size: 8221814
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_39_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tdh87/STORYTELLERV100 | ---
license: apache-2.0
---
|
Glavin001/startup-interviews | ---
license: cc-by-nc-2.0
task_categories:
- question-answering
- text-generation
language:
- en
size_categories:
- n<1K
--- |
zelalt/great_papers_augmentation | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2663516.5807451624
num_examples: 1731
download_size: 1670568
dataset_size: 2663516.5807451624
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
psroy/mini-platypus-guanaco-one-400 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 543862
num_examples: 400
download_size: 328169
dataset_size: 543862
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BitBasher/mini-dataset-978 | ---
dataset_info:
features:
- name: expected_output
dtype: string
- name: instruction
dtype: string
- name: input_content
dtype: string
splits:
- name: train
num_bytes: 825340
num_examples: 978
download_size: 229603
dataset_size: 825340
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_YeungNLP__firefly-llama-13b | ---
pretty_name: Evaluation run of YeungNLP/firefly-llama-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [YeungNLP/firefly-llama-13b](https://huggingface.co/YeungNLP/firefly-llama-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YeungNLP__firefly-llama-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T14:09:32.562166](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-llama-13b/blob/main/results_2023-10-15T14-09-32.562166.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.12720218120805368,\n\
\ \"em_stderr\": 0.003412272897129365,\n \"f1\": 0.22689702181208146,\n\
\ \"f1_stderr\": 0.003654793246945971,\n \"acc\": 0.418998511823731,\n\
\ \"acc_stderr\": 0.009810630903312568\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.12720218120805368,\n \"em_stderr\": 0.003412272897129365,\n\
\ \"f1\": 0.22689702181208146,\n \"f1_stderr\": 0.003654793246945971\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08188021228203184,\n \
\ \"acc_stderr\": 0.0075523385277169374\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7561168113654302,\n \"acc_stderr\": 0.012068923278908197\n\
\ }\n}\n```"
repo_url: https://huggingface.co/YeungNLP/firefly-llama-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T14_09_32.562166
path:
- '**/details_harness|drop|3_2023-10-15T14-09-32.562166.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T14-09-32.562166.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T14_09_32.562166
path:
- '**/details_harness|gsm8k|5_2023-10-15T14-09-32.562166.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T14-09-32.562166.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:51:43.691477.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:51:43.691477.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:51:43.691477.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T14_09_32.562166
path:
- '**/details_harness|winogrande|5_2023-10-15T14-09-32.562166.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T14-09-32.562166.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_51_43.691477
path:
- results_2023-07-19T18:51:43.691477.parquet
- split: 2023_10_15T14_09_32.562166
path:
- results_2023-10-15T14-09-32.562166.parquet
- split: latest
path:
- results_2023-10-15T14-09-32.562166.parquet
---
# Dataset Card for Evaluation run of YeungNLP/firefly-llama-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/YeungNLP/firefly-llama-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [YeungNLP/firefly-llama-13b](https://huggingface.co/YeungNLP/firefly-llama-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-llama-13b",
"harness_winogrande_5",
split="latest")
```
## Latest results
These are the [latest results from run 2023-10-15T14:09:32.562166](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-llama-13b/blob/main/results_2023-10-15T14-09-32.562166.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.12720218120805368,
"em_stderr": 0.003412272897129365,
"f1": 0.22689702181208146,
"f1_stderr": 0.003654793246945971,
"acc": 0.418998511823731,
"acc_stderr": 0.009810630903312568
},
"harness|drop|3": {
"em": 0.12720218120805368,
"em_stderr": 0.003412272897129365,
"f1": 0.22689702181208146,
"f1_stderr": 0.003654793246945971
},
"harness|gsm8k|5": {
"acc": 0.08188021228203184,
"acc_stderr": 0.0075523385277169374
},
"harness|winogrande|5": {
"acc": 0.7561168113654302,
"acc_stderr": 0.012068923278908197
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tejas21private/ai2d_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 238194265
num_examples: 4900
download_size: 238194265
dataset_size: 238194265
---
# Dataset Card for "ai2d dataset"
|
candelaz/Kevin | ---
license: apache-2.0
---
|
harpreetsahota/five_shot_comparison | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: rationale
dtype: string
- name: task
dtype: string
- name: type
dtype: string
- name: decilm_generation
dtype: string
- name: mistral_generation
dtype: string
- name: falcon_generation
dtype: string
splits:
- name: train
num_bytes: 120687
num_examples: 30
download_size: 60550
dataset_size: 120687
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "five_shot_comparison"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilabel-internal-testing/Capybara-Preferences-Remaining | ---
size_categories: n<1K
tags:
- synthetic
- distilabel
- rlaif
dataset_info:
config_name: keep_columns
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: generations
sequence: string
- name: generation_models
sequence: string
- name: ratings
sequence: int64
- name: rationales
sequence: string
- name: ultrafeedback_model
dtype: string
splits:
- name: train
num_bytes: 54020454
num_examples: 8375
download_size: 27568892
dataset_size: 54020454
configs:
- config_name: keep_columns
data_files:
- split: train
path: keep_columns/train-*
---
<p align="left">
<a href="https://github.com/argilla-io/distilabel">
<img src="https://raw.githubusercontent.com/argilla-io/distilabel/main/docs/assets/distilabel-badge-light.png" alt="Built with Distilabel" width="200" height="32"/>
</a>
</p>
# Dataset Card for Capybara-Preferences-Remaining
This dataset has been created with [Distilabel](https://distilabel.argilla.io/).
## Dataset Summary
This dataset contains a `pipeline.yaml` which can be used to reproduce the pipeline that generated it in distilabel using the `distilabel` CLI:
```console
distilabel pipeline run --config "https://huggingface.co/datasets/distilabel-internal-testing/Capybara-Preferences-Remaining/raw/main/pipeline.yaml"
```
or explore the configuration:
```console
distilabel pipeline info --config "https://huggingface.co/datasets/distilabel-internal-testing/Capybara-Preferences-Remaining/raw/main/pipeline.yaml"
```
## Dataset structure
The examples have the following structure per configuration:
<details><summary> Configuration: keep_columns </summary><hr>
```json
{
"generation_models": [
"teknium/OpenHermes-2.5-Mistral-7B",
"argilla/notus-7b-v1",
"mlabonne/NeuralBeagle14-7B",
"ask_ldj"
],
"generations": [
"Certainly, there are several strategies that can help prevent the creation of a hostile environment during a debate or discussion:\n\n1. Encourage Respectful Communication: Set a tone of respect and promote civil discourse by emphasizing the importance of listening, asking questions, and acknowledging different perspectives.\n\n2. Establish Clear Rules: Clearly outline the ground rules for the discussion or debate, emphasizing that personal attacks or ad hominem remarks will not be tolerated.\n\n3. Promote Active Listening: Encourage participants to actively listen to each other, rather than just waiting for",
"Certainly! Here are some strategies that can help prevent the creation of a hostile environment during a debate or discussion:\n\n1. Encourage a Safe and Respectful Space: Create an environment that makes participants feel safe and respected by reminding everyone to avoid ad hominem attacks, to listen to each other, and to respect each other\u0027s viewpoints. Establishing clear ground rules can help everyone focus on the topic at hand and avoid distractions. 2. Encourage Active Listening: Encourage participants to listen actively to each other by letting everyone know that they have been heard,",
"Here are some strategies to prevent the creation of a hostile environment during a debate or discussion:\n\n1. Establish ground rules: At the beginning of a debate or discussion, establish clear ground rules or guidelines that promote civility and respect among participants. This may include prohibiting personal attacks, ad hominem arguments, interruptions, or negative body language.\n\n2. Encourage active listening: Encourage all participants to listen actively and attentively to each other\u0027s perspectives. This promotes understanding and can help reduce misunderstandings or emotional escalation.\n\n3. Focus on the issue",
"Absolutely, here are some strategies to prevent the creation of a hostile environment during a debate or discussion:\n\n1. Establish Ground Rules: Before starting the discussion, set clear expectations for respectful and constructive dialogue. This might include rules against personal attacks, interruptions, or disrespectful language.\n\n2. Encourage Active Listening: Participants should be encouraged to listen to others\u0027 viewpoints with an open mind, rather than formulating their own responses while others are speaking.\n\n3. Promote Empathy: Encourage participants to try to understand the perspectives and feelings of others, even if they disagree. This can help to foster a sense of mutual respect and understanding.\n\n4. Focus on the Issue, Not the Person: Remind participants to focus on the topic at hand, rather than resorting to personal attacks or criticisms. \n\n5. Use Neutral Language: Avoid inflammatory or loaded language that could escalate tensions. Instead, use neutral, objective language to express your points.\n\n6. Facilitate Fairness: If you\u0027re leading the discussion, ensure that all participants have an equal opportunity to speak and that their viewpoints are given fair consideration.\n\n7. Encourage Questions: Encourage participants to ask clarifying questions if they don\u0027t understand a point, rather than making assumptions or jumping to conclusions.\n\n8. Model Respectful Behavior: If you\u0027re leading the discussion, model the kind of behavior you want to see from participants. This includes listening attentively, responding respectfully, and admitting when you\u0027re wrong.\n\n9. Manage Conflict: If tensions do arise, address them directly and calmly. This might involve taking a short break, reminding participants of the ground rules, or facilitating a resolution between conflicting parties.\n\n10. 
Provide Positive Feedback: Recognize and reinforce positive behaviors, such as when participants express their points respectfully, listen attentively, or show empathy towards others. This can help to create a more positive and collaborative environment."
],
"messages": [
{
"content": "Identify the type of logical fallacy in this argument: \u0027You can\u0027t trust Peter because he didn\u0027t go to college.\u0027\n",
"role": "user"
},
{
"content": "The logical fallacy in this argument is ad hominem, because it attacks Peter\u0027s personal characteristics or circumstances (not going to college) rather than addressing the merits of his arguments or ideas.",
"role": "assistant"
},
{
"content": "Given that ad hominem is a type of logical fallacy, how might this bias affect the quality of a debate or discussion?",
"role": "user"
},
{
"content": "Ad hominem fallacies can significantly lower the quality of a debate or discussion. Instead of focusing on the merits of the arguments, ideas, or evidence at hand, participants might get distracted by personal attacks or irrelevant criticisms. This can lead to a hostile environment, where the focus is on winning the argument rather than seeking truth or understanding. Furthermore, it can prevent a fair and balanced evaluation of the actual topic being discussed, as it diverts attention away from the substantive issues.",
"role": "assistant"
},
{
"content": "Discuss the potential consequences of a hostile environment in a debate or discussion.",
"role": "user"
},
{
"content": "A hostile environment in a debate or discussion can have several negative consequences:\n\n1. Deterioration of Communication: When the environment becomes hostile, it can lead to breakdowns in communication. Participants may become defensive or aggressive, which can hinder effective exchange of ideas.\n\n2. Reduced Participation: Some individuals may feel intimidated or uncomfortable in a hostile environment, leading them to withdraw or refrain from contributing their ideas or perspectives.\n\n3. Bias and Prejudice: Hostility can exacerbate biases and prejudices, as participants may resort to stereotypes or discriminatory remarks, further escalating the conflict.\n\n4. Loss of Objectivity: In a hostile environment, participants may lose sight of the objective of the discussion or debate. Instead of seeking to understand different viewpoints or find common ground, they may focus on winning the argument at all costs.\n\n5. Damage to Relationships: If the debate or discussion involves individuals who have ongoing relationships (like colleagues, friends, or family members), a hostile environment can cause long-term damage to these relationships.\n\n6. Misinformation: In a hostile debate, the focus often shifts from facts and logical reasoning to personal attacks. This can lead to the spread of misinformation, as participants may overlook or dismiss valid points or evidence.\n\nOverall, a hostile environment in a debate or discussion can inhibit productive dialogue, discourage participation, and potentially lead to misinformation or misunderstanding.",
"role": "assistant"
},
{
"content": "Could you elaborate on strategies to prevent the creation of a hostile environment during a debate or discussion?",
"role": "user"
}
],
"ratings": [
4,
4
],
"rationales": [
"The text provides correct and informative strategies to prevent hostility, expressing a clear understanding of the set topics. It does not appear to include any dishonesty, uncertainty, or hallucination; however, the response is cut off, hence not fully complete. This affects the instruction following slightly as the list of strategies is incomplete.",
"This text also offers valuable strategies for maintaining a non-hostile environment, aligning well with the user\u0027s request. The strategies are articulated clearly and confidently. However, the output"
],
"ultrafeedback_model": "gpt-4-1106-preview"
}
```
This subset can be loaded as:
```python
from datasets import load_dataset
ds = load_dataset("distilabel-internal-testing/Capybara-Preferences-Remaining", "keep_columns")
```
</details>
|
spdenisov/processed | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 548644785
num_examples: 191025
download_size: 80309791
dataset_size: 548644785
---
# Dataset Card for "processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alisson40889/HOLK | ---
license: openrail
---
|
Mitsuki-Sakamoto/alpaca_farm-reward-model-deberta-v3-large-v2-re-preference-64-nsample-2_random | ---
dataset_info:
- config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
splits:
- name: preference
num_bytes: 25669546
num_examples: 20001
download_size: 12196543
dataset_size: 25669546
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
splits:
- name: preference
num_bytes: 25645562
num_examples: 20001
download_size: 12148697
dataset_size: 25645562
- config_name: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
splits:
- name: preference
num_bytes: 25489159
num_examples: 20001
download_size: 11795765
dataset_size: 25489159
configs:
- config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
data_files:
- split: preference
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/preference-*
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: preference
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/preference-*
- config_name: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: preference
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/preference-*
---
|
FarmerlineML/igbo_dataset_2 | ---
dataset_info:
features:
- name: transcription
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 162930082.12
num_examples: 4892
- name: test
num_bytes: 28447287.0
num_examples: 802
download_size: 185263836
dataset_size: 191377369.12
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
nz/ulma_anthropic_hh_golden | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 37152501
num_examples: 42537
- name: test
num_bytes: 2071585
num_examples: 2312
download_size: 22809621
dataset_size: 39224086
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Damoniano/Agent1 | ---
license: creativeml-openrail-m
---
|
davidfant/natural-questions-chunk-11 | ---
dataset_info:
features:
- name: id
dtype: string
- name: document
struct:
- name: html
dtype: string
- name: title
dtype: string
- name: tokens
sequence:
- name: end_byte
dtype: int64
- name: is_html
dtype: bool
- name: start_byte
dtype: int64
- name: token
dtype: string
- name: url
dtype: string
- name: question
struct:
- name: text
dtype: string
- name: tokens
sequence: string
- name: long_answer_candidates
sequence:
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: top_level
dtype: bool
- name: annotations
sequence:
- name: id
dtype: string
- name: long_answer
struct:
- name: candidate_index
dtype: int64
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: short_answers
sequence:
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: text
dtype: string
- name: yes_no_answer
dtype:
class_label:
names:
'0': 'NO'
'1': 'YES'
splits:
- name: train
num_bytes: 4676417206
num_examples: 10000
download_size: 1813929282
dataset_size: 4676417206
---
# Dataset Card for "natural-questions-chunk-11"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rafasds/Alannovo | ---
license: openrail
---
|
eduardem/parrot_en_es_sample | ---
license: openrail
---
# Parrot_en_es Fine-Tuning Dataset
## Introduction
This dataset serves as a sample for fine-tuning the `parrot_en_es` translation model. The data is structured in the OpenOrca format and is designed to train the model to perform specific tasks related to translation.
## Objective
The primary aim of this dataset is to achieve alignment in the model so that it acts strictly as a translator. The model should output solely the translation, avoiding any additional comments, considerations, extra content, fixes, or refusals.
## Format
### OpenOrca
The dataset employs the OpenOrca format and maintains a consistent system prompt across all samples. This uniformity in system prompts is intended to guide the model towards the desired behavior, which is to produce only the translation.
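For illustration, a single training sample in this layout might look like the sketch below. The field names (`id`, `system_prompt`, `question`, `response`) are assumptions based on the common OpenOrca schema, and the prompt wording is hypothetical, not taken from this dataset:

```python
# Hypothetical OpenOrca-style sample; field names and prompt text are
# assumptions, not confirmed by this card.
sample = {
    "id": "parrot_en_es_0001",
    "system_prompt": (
        "You are a translator. Translate the user's text from English to "
        "Spanish. Output only the translation."
    ),
    "question": "Translate to Spanish: The weather is nice today.",
    "response": "Hoy hace buen tiempo.",
}

# The card states the system prompt is identical across all samples, so the
# model sees one uniform instruction and learns to emit only the translation.
def shares_system_prompt(samples):
    prompts = {s["system_prompt"] for s in samples}
    return len(prompts) == 1

print(shares_system_prompt([sample]))  # True
```

A consistent system prompt like this lets the fine-tuned model associate the instruction with bare translation output, which is the alignment goal described above.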
## Usage
This dataset can be used to:
- Fine-tune the `parrot_en_es` translation model
- Validate the model's ability to output clean translations without any superfluous content
- Assess the effectiveness of using a consistent system prompt to guide model behavior
## Contributing
For suggestions to improve or expand the dataset, feel free to open an issue or submit a pull request.
|
irds/mr-tydi_ja | ---
pretty_name: '`mr-tydi/ja`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `mr-tydi/ja`
The `mr-tydi/ja` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mr-tydi#mr-tydi/ja).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=7,000,027
- `queries` (i.e., topics); count=5,353
 - `qrels` (relevance assessments); count=5,548
This dataset is used by: [`mr-tydi_ja_dev`](https://huggingface.co/datasets/irds/mr-tydi_ja_dev), [`mr-tydi_ja_test`](https://huggingface.co/datasets/irds/mr-tydi_ja_test), [`mr-tydi_ja_train`](https://huggingface.co/datasets/irds/mr-tydi_ja_train)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/mr-tydi_ja', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
queries = load_dataset('irds/mr-tydi_ja', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mr-tydi_ja', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Zhang2021MrTyDi,
title={{Mr. TyDi}: A Multi-lingual Benchmark for Dense Retrieval},
author={Xinyu Zhang and Xueguang Ma and Peng Shi and Jimmy Lin},
year={2021},
journal={arXiv:2108.08787},
}
@article{Clark2020TyDiQa,
title={{TyDi QA}: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages},
author={Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki},
year={2020},
journal={Transactions of the Association for Computational Linguistics}
}
```
|
hungsmoie/khoaiphim | ---
license: unlicense
---
|
Farisya/usermanual-dataset | ---
dataset_info:
features:
- name: example
dtype: string
splits:
- name: train
num_bytes: 18543
num_examples: 22
- name: test
num_bytes: 8671
num_examples: 9
download_size: 20134
dataset_size: 27214
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k | ---
pretty_name: Evaluation run of conceptofmind/Hermes-LLongMA-2-7b-8k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [conceptofmind/Hermes-LLongMA-2-7b-8k](https://huggingface.co/conceptofmind/Hermes-LLongMA-2-7b-8k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T10:12:42.075501](https://huggingface.co/datasets/open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k/blob/main/results_2023-09-13T10-12-42.075501.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2927085297989137,\n\
\ \"acc_stderr\": 0.03275517148401362,\n \"acc_norm\": 0.29622047886660385,\n\
\ \"acc_norm_stderr\": 0.032746678820457335,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662594,\n \"mc2\": 0.38838708556166845,\n\
\ \"mc2_stderr\": 0.014198737236851828\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.46928327645051193,\n \"acc_stderr\": 0.014583792546304038,\n\
\ \"acc_norm\": 0.4974402730375427,\n \"acc_norm_stderr\": 0.014611199329843777\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5497908783110934,\n\
\ \"acc_stderr\": 0.004964979120927565,\n \"acc_norm\": 0.7288388767177854,\n\
\ \"acc_norm_stderr\": 0.004436505187567003\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.035914440841969694,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.035914440841969694\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.25660377358490566,\n \"acc_stderr\": 0.02688064788905197,\n\
\ \"acc_norm\": 0.25660377358490566,\n \"acc_norm_stderr\": 0.02688064788905197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165085,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165085\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749895,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749895\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.029513196625539355,\n\
\ \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.029513196625539355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948365,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948365\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.0361960452412425,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.0361960452412425\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\
\ \"acc_stderr\": 0.02458002892148101,\n \"acc_norm\": 0.24838709677419354,\n\
\ \"acc_norm_stderr\": 0.02458002892148101\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.1921182266009852,\n \"acc_stderr\": 0.02771931570961478,\n\
\ \"acc_norm\": 0.1921182266009852,\n \"acc_norm_stderr\": 0.02771931570961478\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206824,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206824\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.36363636363636365,\n \"acc_stderr\": 0.03756335775187896,\n\
\ \"acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.03756335775187896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2538860103626943,\n \"acc_stderr\": 0.03141024780565319,\n\
\ \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565319\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24358974358974358,\n \"acc_stderr\": 0.02176373368417392,\n\
\ \"acc_norm\": 0.24358974358974358,\n \"acc_norm_stderr\": 0.02176373368417392\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23333333333333334,\n \"acc_stderr\": 0.02578787422095932,\n \
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.02578787422095932\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868956,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3155963302752294,\n \"acc_stderr\": 0.019926117513869662,\n \"\
acc_norm\": 0.3155963302752294,\n \"acc_norm_stderr\": 0.019926117513869662\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.031141447823536023,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.031141447823536023\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.03308611113236435,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03308611113236435\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3881856540084388,\n \"acc_stderr\": 0.0317229500433233,\n \
\ \"acc_norm\": 0.3881856540084388,\n \"acc_norm_stderr\": 0.0317229500433233\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.273542600896861,\n\
\ \"acc_stderr\": 0.029918586707798824,\n \"acc_norm\": 0.273542600896861,\n\
\ \"acc_norm_stderr\": 0.029918586707798824\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4132231404958678,\n \"acc_stderr\": 0.04495087843548408,\n \"\
acc_norm\": 0.4132231404958678,\n \"acc_norm_stderr\": 0.04495087843548408\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3504273504273504,\n\
\ \"acc_stderr\": 0.03125610824421881,\n \"acc_norm\": 0.3504273504273504,\n\
\ \"acc_norm_stderr\": 0.03125610824421881\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2988505747126437,\n\
\ \"acc_stderr\": 0.01636925681509314,\n \"acc_norm\": 0.2988505747126437,\n\
\ \"acc_norm_stderr\": 0.01636925681509314\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.30057803468208094,\n \"acc_stderr\": 0.024685316867257796,\n\
\ \"acc_norm\": 0.30057803468208094,\n \"acc_norm_stderr\": 0.024685316867257796\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3300653594771242,\n \"acc_stderr\": 0.026925654653615686,\n\
\ \"acc_norm\": 0.3300653594771242,\n \"acc_norm_stderr\": 0.026925654653615686\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.025630824975621344,\n\
\ \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.025630824975621344\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.28368794326241137,\n \"acc_stderr\": 0.02689170942834396,\n \
\ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.02689170942834396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.28748370273794005,\n\
\ \"acc_stderr\": 0.011559337355708502,\n \"acc_norm\": 0.28748370273794005,\n\
\ \"acc_norm_stderr\": 0.011559337355708502\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.28104575163398693,\n \"acc_stderr\": 0.018185218954318075,\n \
\ \"acc_norm\": 0.28104575163398693,\n \"acc_norm_stderr\": 0.018185218954318075\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4163265306122449,\n \"acc_stderr\": 0.03155782816556165,\n\
\ \"acc_norm\": 0.4163265306122449,\n \"acc_norm_stderr\": 0.03155782816556165\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.34328358208955223,\n\
\ \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.34328358208955223,\n\
\ \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.034605799075530276,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.034605799075530276\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.39766081871345027,\n \"acc_stderr\": 0.0375363895576169,\n\
\ \"acc_norm\": 0.39766081871345027,\n \"acc_norm_stderr\": 0.0375363895576169\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662594,\n \"mc2\": 0.38838708556166845,\n\
\ \"mc2_stderr\": 0.014198737236851828\n }\n}\n```"
repo_url: https://huggingface.co/conceptofmind/Hermes-LLongMA-2-7b-8k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|arc:challenge|25_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hellaswag|10_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T10-12-42.075501.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T10-12-42.075501.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T10-12-42.075501.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T10-12-42.075501.parquet'
- config_name: results
data_files:
- split: 2023_09_13T10_12_42.075501
path:
- results_2023-09-13T10-12-42.075501.parquet
- split: latest
path:
- results_2023-09-13T10-12-42.075501.parquet
---
# Dataset Card for Evaluation run of conceptofmind/Hermes-LLongMA-2-7b-8k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/conceptofmind/Hermes-LLongMA-2-7b-8k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [conceptofmind/Hermes-LLongMA-2-7b-8k](https://huggingface.co/conceptofmind/Hermes-LLongMA-2-7b-8k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
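As an aside (an illustration added here, not part of the generated card): the split names are derived from the run timestamp by replacing `-` and `:` with `_`, while keeping the `.` before the microseconds. A minimal sketch of that mapping:

```python
def run_timestamp_to_split_name(run_timestamp: str) -> str:
    # e.g. "2023-09-13T10:12:42.075501" -> "2023_09_13T10_12_42.075501"
    return run_timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split_name("2023-09-13T10:12:42.075501"))
# 2023_09_13T10_12_42.075501
```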
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-13T10:12:42.075501](https://huggingface.co/datasets/open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k/blob/main/results_2023-09-13T10-12-42.075501.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2927085297989137,
"acc_stderr": 0.03275517148401362,
"acc_norm": 0.29622047886660385,
"acc_norm_stderr": 0.032746678820457335,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662594,
"mc2": 0.38838708556166845,
"mc2_stderr": 0.014198737236851828
},
"harness|arc:challenge|25": {
"acc": 0.46928327645051193,
"acc_stderr": 0.014583792546304038,
"acc_norm": 0.4974402730375427,
"acc_norm_stderr": 0.014611199329843777
},
"harness|hellaswag|10": {
"acc": 0.5497908783110934,
"acc_stderr": 0.004964979120927565,
"acc_norm": 0.7288388767177854,
"acc_norm_stderr": 0.004436505187567003
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.035914440841969694,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.035914440841969694
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.25660377358490566,
"acc_stderr": 0.02688064788905197,
"acc_norm": 0.25660377358490566,
"acc_norm_stderr": 0.02688064788905197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165085,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165085
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749895,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749895
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.029513196625539355,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.029513196625539355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948365,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948365
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.0361960452412425,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.0361960452412425
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.02458002892148101,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.02458002892148101
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1921182266009852,
"acc_stderr": 0.02771931570961478,
"acc_norm": 0.1921182266009852,
"acc_norm_stderr": 0.02771931570961478
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206824,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206824
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.03756335775187896,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.03756335775187896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2538860103626943,
"acc_stderr": 0.03141024780565319,
"acc_norm": 0.2538860103626943,
"acc_norm_stderr": 0.03141024780565319
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.02176373368417392,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.02176373368417392
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.02578787422095932,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.02578787422095932
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868956,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3155963302752294,
"acc_stderr": 0.019926117513869662,
"acc_norm": 0.3155963302752294,
"acc_norm_stderr": 0.019926117513869662
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.031141447823536023,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.031141447823536023
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03308611113236435,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03308611113236435
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3881856540084388,
"acc_stderr": 0.0317229500433233,
"acc_norm": 0.3881856540084388,
"acc_norm_stderr": 0.0317229500433233
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.273542600896861,
"acc_stderr": 0.029918586707798824,
"acc_norm": 0.273542600896861,
"acc_norm_stderr": 0.029918586707798824
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4132231404958678,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.4132231404958678,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3504273504273504,
"acc_stderr": 0.03125610824421881,
"acc_norm": 0.3504273504273504,
"acc_norm_stderr": 0.03125610824421881
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2988505747126437,
"acc_stderr": 0.01636925681509314,
"acc_norm": 0.2988505747126437,
"acc_norm_stderr": 0.01636925681509314
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.024685316867257796,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.024685316867257796
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3300653594771242,
"acc_stderr": 0.026925654653615686,
"acc_norm": 0.3300653594771242,
"acc_norm_stderr": 0.026925654653615686
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.025630824975621344,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.025630824975621344
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.02689170942834396,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.02689170942834396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.28748370273794005,
"acc_stderr": 0.011559337355708502,
"acc_norm": 0.28748370273794005,
"acc_norm_stderr": 0.011559337355708502
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.28104575163398693,
"acc_stderr": 0.018185218954318075,
"acc_norm": 0.28104575163398693,
"acc_norm_stderr": 0.018185218954318075
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4163265306122449,
"acc_stderr": 0.03155782816556165,
"acc_norm": 0.4163265306122449,
"acc_norm_stderr": 0.03155782816556165
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.34328358208955223,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.34328358208955223,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.034605799075530276,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.034605799075530276
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.39766081871345027,
"acc_stderr": 0.0375363895576169,
"acc_norm": 0.39766081871345027,
"acc_norm_stderr": 0.0375363895576169
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662594,
"mc2": 0.38838708556166845,
"mc2_stderr": 0.014198737236851828
}
}
```
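The top-level `"all"` block aggregates the per-task metrics. As an illustrative sketch (not part of the generated card, using only three of the task scores above rather than the full set), such a macro-average of `acc` values can be recomputed like this:

```python
import statistics

# A small, hypothetical subset of the per-task results shown above.
per_task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.22,
    "harness|hendrycksTest-anatomy|5": 0.2222222222222222,
    "harness|hendrycksTest-astronomy|5": 0.2894736842105263,
}

# Macro-average: unweighted mean over tasks.
macro_acc = statistics.mean(per_task_acc.values())
print(round(macro_acc, 4))  # 0.2439
```

Averaged over all 59 tasks instead of these three, this is how the reported `"all"` accuracy is obtained.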
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_klyang__MentaLLaMA-chat-7B | ---
pretty_name: Evaluation run of klyang/MentaLLaMA-chat-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [klyang/MentaLLaMA-chat-7B](https://huggingface.co/klyang/MentaLLaMA-chat-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 1 configuration, corresponding to the evaluated\
\ task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_klyang__MentaLLaMA-chat-7B\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T22:04:51.845916](https://huggingface.co/datasets/open-llm-leaderboard/details_klyang__MentaLLaMA-chat-7B/blob/main/results_2023-12-02T22-04-51.845916.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.16148597422289612,\n\
\ \"acc_stderr\": 0.010135959452134306\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.16148597422289612,\n \"acc_stderr\": 0.010135959452134306\n\
\ }\n}\n```"
repo_url: https://huggingface.co/klyang/MentaLLaMA-chat-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_02T22_04_51.845916
path:
- '**/details_harness|gsm8k|5_2023-12-02T22-04-51.845916.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T22-04-51.845916.parquet'
- config_name: results
data_files:
- split: 2023_12_02T22_04_51.845916
path:
- results_2023-12-02T22-04-51.845916.parquet
- split: latest
path:
- results_2023-12-02T22-04-51.845916.parquet
---
# Dataset Card for Evaluation run of klyang/MentaLLaMA-chat-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/klyang/MentaLLaMA-chat-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [klyang/MentaLLaMA-chat-7B](https://huggingface.co/klyang/MentaLLaMA-chat-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, corresponding to the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_klyang__MentaLLaMA-chat-7B",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-02T22:04:51.845916](https://huggingface.co/datasets/open-llm-leaderboard/details_klyang__MentaLLaMA-chat-7B/blob/main/results_2023-12-02T22-04-51.845916.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.16148597422289612,
"acc_stderr": 0.010135959452134306
},
"harness|gsm8k|5": {
"acc": 0.16148597422289612,
"acc_stderr": 0.010135959452134306
}
}
```
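As a consistency check (my own sketch; the counts below are inferred, not stated in the card): an accuracy of 0.16148597… corresponds to 213 correct answers out of GSM8K's 1319 test problems, and the reported `acc_stderr` matches the sample standard error of that proportion:

```python
import math

n_total = 1319   # GSM8K test-set size (assumption; not stated in this card)
n_correct = 213  # implied by acc = 213 / 1319 = 0.16148597...

acc = n_correct / n_total
# Sample standard error of the mean of 0/1 outcomes (ddof=1 denominator).
stderr = math.sqrt(acc * (1 - acc) / (n_total - 1))
print(acc, stderr)
```

Running this reproduces both numbers reported in the JSON above to full printed precision.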
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
edouard-rolland/volcanic-plumes | ---
task_categories:
- object-detection
tags:
- roboflow
- roboflow2huggingface
- Volcanoes
- Plumes
- UAVs
- Drone
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': plume
'1': summit
splits:
- name: train
num_bytes: 29846342.127
num_examples: 1211
- name: validation
num_bytes: 7311174
num_examples: 294
- name: test
num_bytes: 12048406
num_examples: 456
download_size: 49324639
dataset_size: 49205922.127000004
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
license: mit
language:
- en
---
<div align="center">
<img width=100% alt="edouard-rolland/volcanic-plumes" src="https://huggingface.co/datasets/edouard-rolland/volcanic-plumes/resolve/main/thumbnail.png">
</div>
# Dataset Description
The dataset presents labelled pictures of plumes and of the Fuego Summit in Guatemala. The data was collected by the University of Bristol Flight Lab in Guatemala from March 22 to April 3, 2019. The drone used for this purpose was a Skywalker X8, equipped with a Pixhawk onboard computer running ArduPlane 3.7.1 and a Raspberry Pi 3B+ for mission management and communication with the ground station. The drone was also equipped with a GoPro Hero 9.
# Citation
```
@inproceedings{rolland2024volcanic,
author = {Edouard G. A. Rolland and Kasper A. R. Grøntved and Anders Lyhne Christensen and Matthew Watson and Tom Richardson},
  title = {Autonomous {UAV} Volcanic Plume Sampling Based on Machine Vision and Path Planning},
  year = {2024},
note = {Under review},
}
```
# Acknowledgement
This work is supported by the WildDrone MSCA Doctoral Network funded by EU Horizon Europe under grant agreement no. 101071224, the Innovation Fund Denmark for the project DIREC (9142-00001B), and by the Engineering & Physical Sciences Research Council (UK) through the CASCADE (Complex Autonomous aircraft Systems Configuration, Analysis and Design Exploratory) programme grant (EP/R009953/1).
# Dataset Labels
```
['plume', 'summit']
```
# Example of Labelled Images
<div align="center">
  <img width=50% alt="edouard-rolland/volcanic-plumes" src="https://huggingface.co/datasets/edouard-rolland/volcanic-plumes/resolve/main/val_batch0_labels.jpg">
</div>
# Number of Images
```json
{'valid': 294, 'test': 456, 'train': 1211}
```
# Example of Application
The dataset was used to train a YOLOv8 neural network. More details can be found in the paper mentioned in the citation section. The following <a href="https://www.youtube.com/watch?v=pSGYUPancfA">video</a> presents the model output for an entire flight.
# How to Use
- Install [datasets](https://pypi.org/project/datasets/):
```bash
pip install datasets
```
- Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("edouard-rolland/volcanic-plumes")
example = ds['train'][0]
```
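Each record follows the feature schema above: `image`, `width`, `height`, and an `objects` sequence with COCO-style `[x, y, w, h]` bounding boxes and integer category labels. The snippet below is a minimal sketch of decoding one record's annotations; the `sample` dict is a hypothetical stand-in for a real `ds['train'][0]` record:

```python
# Hypothetical record mimicking one dataset entry; a real record comes
# from ds['train'][0] after loading with the `datasets` library.
sample = {
    "image_id": 0,
    "width": 1920,
    "height": 1080,
    "objects": {
        "id": [0, 1],
        "area": [40000, 9000],
        # COCO-style boxes: [x, y, width, height]
        "bbox": [[100.0, 50.0, 400.0, 100.0], [700.0, 600.0, 150.0, 60.0]],
        "category": [0, 1],
    },
}

CATEGORIES = ["plume", "summit"]  # from the dataset labels above

def decode_objects(record):
    """Pair each bounding box with its human-readable class name."""
    objs = record["objects"]
    return [
        (CATEGORIES[cat], tuple(bbox))
        for cat, bbox in zip(objs["category"], objs["bbox"])
    ]

print(decode_objects(sample))
```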
# License
MIT |
Cohere/miracl-sw-queries-22-12 | ---
annotations_creators:
- expert-generated
language:
- sw
multilinguality:
- multilingual
size_categories: []
source_datasets: []
tags: []
task_categories:
- text-retrieval
license:
- apache-2.0
task_ids:
- document-retrieval
---
# MIRACL (sw) embedded with cohere.ai `multilingual-22-12` encoder
We encoded the [MIRACL dataset](https://huggingface.co/miracl) using the [cohere.ai](https://txt.cohere.ai/multilingual/) `multilingual-22-12` embedding model.
The query embeddings can be found in [Cohere/miracl-sw-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-sw-queries-22-12) and the corpus embeddings can be found in [Cohere/miracl-sw-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-sw-corpus-22-12).
For the original datasets, see [miracl/miracl](https://huggingface.co/datasets/miracl/miracl) and [miracl/miracl-corpus](https://huggingface.co/datasets/miracl/miracl-corpus).
Dataset info:
> MIRACL 🌍🙌🌏 (Multilingual Information Retrieval Across a Continuum of Languages) is a multilingual retrieval dataset that focuses on search across 18 different languages, which collectively encompass over three billion native speakers around the world.
>
> The corpus for each language is prepared from a Wikipedia dump, where we keep only the plain text and discard images, tables, etc. Each article is segmented into multiple passages using WikiExtractor based on natural discourse units (e.g., `\n\n` in the wiki markup). Each of these passages comprises a "document" or unit of retrieval. We preserve the Wikipedia article title of each passage.
## Embeddings
We compute embeddings for `title+" "+text` using our `multilingual-22-12` embedding model, a state-of-the-art model for semantic search in 100 languages. If you want to learn more about this model, have a look at [cohere.ai multilingual embedding model](https://txt.cohere.ai/multilingual/).
## Loading the dataset
In [miracl-sw-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-sw-corpus-22-12) we provide the corpus embeddings. Note, depending on the selected split, the respective files can be quite large.
You can either load the dataset like this:
```python
from datasets import load_dataset
docs = load_dataset("Cohere/miracl-sw-corpus-22-12", split="train")
```
Or you can also stream it without downloading it before:
```python
from datasets import load_dataset
docs = load_dataset("Cohere/miracl-sw-corpus-22-12", split="train", streaming=True)
for doc in docs:
docid = doc['docid']
title = doc['title']
text = doc['text']
emb = doc['emb']
```
## Search
Have a look at [miracl-sw-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-sw-queries-22-12) where we provide the query embeddings for the MIRACL dataset.
To search the documents, you must use **dot-product** similarity: compute the query embedding, then compare it against the document embeddings, either with a vector database (recommended) or by computing the dot products directly.
A full search example:
```python
# Attention! For large datasets, this requires a lot of memory to store
# all document embeddings and to compute the dot product scores.
# Only use this for smaller datasets. For large datasets, use a vector DB
from datasets import load_dataset
import torch
#Load documents + embeddings
docs = load_dataset("Cohere/miracl-sw-corpus-22-12", split="train")
doc_embeddings = torch.tensor(docs['emb'])
# Load queries
queries = load_dataset("Cohere/miracl-sw-queries-22-12", split="dev")
# Select the first query as example
qid = 0
query = queries[qid]
query_embedding = torch.tensor([query['emb']])  # shape: (1, embedding_dim)
# Compute dot score between query embedding and document embeddings
dot_scores = torch.mm(query_embedding, doc_embeddings.transpose(0, 1))
top_k = torch.topk(dot_scores, k=3)
# Print results
print("Query:", query['query'])
for doc_id in top_k.indices[0].tolist():
print(docs[doc_id]['title'])
print(docs[doc_id]['text'])
```
You can get embeddings for new queries using our API:
```python
#Run: pip install cohere
import cohere
co = cohere.Client(api_key)  # add your Cohere API key here
texts = ['my search query']
response = co.embed(texts=texts, model='multilingual-22-12')
query_embedding = response.embeddings[0] # Get the embedding for the first text
```
## Performance
In the following table we compare the cohere multilingual-22-12 model with Elasticsearch version 8.6.0 lexical search (title and passage indexed as independent fields). Note that Elasticsearch doesn't support all languages that are part of the MIRACL dataset.
We compute nDCG@10 (a ranking-based metric) as well as hit@3: whether at least one relevant document appears in the top-3 results. We find hit@3 easier to interpret, as it gives the fraction of queries for which a relevant document is found among the top-3 results.
Note: MIRACL only annotated a small fraction of passages (10 per query) for relevancy. Especially for larger Wikipedias (like English), we often found many more relevant passages. This is known as annotation holes. The true nDCG@10 and hit@3 performance is likely higher than reported.
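For reference, the two metrics can be sketched as follows for binary relevance judgements. This is a simplified illustration, not the official MIRACL evaluation tooling:

```python
import math

def ndcg_at_k(relevances, k=10):
    """nDCG@k for a ranked list of binary relevance flags (1 = relevant).

    DCG discounts each hit by log2(rank + 2); the result is normalized
    by the DCG of an ideal ranking (all relevant documents first).
    """
    dcg = sum(rel / math.log2(rank + 2) for rank, rel in enumerate(relevances[:k]))
    ideal_hits = min(sum(relevances), k)
    idcg = sum(1.0 / math.log2(rank + 2) for rank in range(ideal_hits))
    return dcg / idcg if idcg > 0 else 0.0

def hit_at_k(relevances, k=3):
    """1 if at least one relevant document appears in the top-k, else 0."""
    return int(any(relevances[:k]))

# A relevant document at rank 1 gives a perfect nDCG and a hit:
print(ndcg_at_k([1, 0, 0, 0]), hit_at_k([1, 0, 0, 0]))
```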
| Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 | ES 8.6.0 nDCG@10 | ES 8.6.0 hit@3 |
|---|---|---|---|---|
| miracl-ar | 64.2 | 75.2 | 46.8 | 56.2 |
| miracl-bn | 61.5 | 75.7 | 49.2 | 60.1 |
| miracl-de | 44.4 | 60.7 | 19.6 | 29.8 |
| miracl-en | 44.6 | 62.2 | 30.2 | 43.2 |
| miracl-es | 47.0 | 74.1 | 27.0 | 47.2 |
| miracl-fi | 63.7 | 76.2 | 51.4 | 61.6 |
| miracl-fr | 46.8 | 57.1 | 17.0 | 21.6 |
| miracl-hi | 50.7 | 62.9 | 41.0 | 48.9 |
| miracl-id | 44.8 | 63.8 | 39.2 | 54.7 |
| miracl-ru | 49.2 | 66.9 | 25.4 | 36.7 |
| **Avg** | 51.7 | 67.5 | 34.7 | 46.0 |
Further languages (not supported by Elasticsearch):
| Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 |
|---|---|---|
| miracl-fa | 44.8 | 53.6 |
| miracl-ja | 49.0 | 61.0 |
| miracl-ko | 50.9 | 64.8 |
| miracl-sw | 61.4 | 74.5 |
| miracl-te | 67.8 | 72.3 |
| miracl-th | 60.2 | 71.9 |
| miracl-yo | 56.4 | 62.2 |
| miracl-zh | 43.8 | 56.5 |
| **Avg** | 54.3 | 64.6 |
|
open-llm-leaderboard/details_azarafrooz__gemma-2b-it-nlai-v0 | ---
pretty_name: Evaluation run of azarafrooz/gemma-2b-it-nlai-v0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [azarafrooz/gemma-2b-it-nlai-v0](https://huggingface.co/azarafrooz/gemma-2b-it-nlai-v0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_azarafrooz__gemma-2b-it-nlai-v0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T12:20:29.148062](https://huggingface.co/datasets/open-llm-leaderboard/details_azarafrooz__gemma-2b-it-nlai-v0/blob/main/results_2024-04-15T12-20-29.148062.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.37660169444969677,\n\
\ \"acc_stderr\": 0.03380673348549725,\n \"acc_norm\": 0.3811167415748182,\n\
\ \"acc_norm_stderr\": 0.03461538621374684,\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.015866346401384308,\n \"mc2\": 0.45797553398869045,\n\
\ \"mc2_stderr\": 0.015928281785783458\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4052901023890785,\n \"acc_stderr\": 0.014346869060229327,\n\
\ \"acc_norm\": 0.4402730375426621,\n \"acc_norm_stderr\": 0.014506769524804246\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.48137821151165106,\n\
\ \"acc_stderr\": 0.004986319587524964,\n \"acc_norm\": 0.6266679944234216,\n\
\ \"acc_norm_stderr\": 0.004827006520802888\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.3851851851851852,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.03842498559395269,\n\
\ \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.03842498559395269\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4226415094339623,\n \"acc_stderr\": 0.030402331445769537,\n\
\ \"acc_norm\": 0.4226415094339623,\n \"acc_norm_stderr\": 0.030402331445769537\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.3472222222222222,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.3583815028901734,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3574468085106383,\n \"acc_stderr\": 0.03132941789476425,\n\
\ \"acc_norm\": 0.3574468085106383,\n \"acc_norm_stderr\": 0.03132941789476425\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3193548387096774,\n\
\ \"acc_stderr\": 0.026522709674667765,\n \"acc_norm\": 0.3193548387096774,\n\
\ \"acc_norm_stderr\": 0.026522709674667765\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.0319474007226554,\n\
\ \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.0319474007226554\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.45454545454545453,\n \"acc_stderr\": 0.03888176921674098,\n\
\ \"acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.03888176921674098\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4595959595959596,\n \"acc_stderr\": 0.035507024651313425,\n \"\
acc_norm\": 0.4595959595959596,\n \"acc_norm_stderr\": 0.035507024651313425\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.47150259067357514,\n \"acc_stderr\": 0.03602573571288442,\n\
\ \"acc_norm\": 0.47150259067357514,\n \"acc_norm_stderr\": 0.03602573571288442\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.33076923076923076,\n \"acc_stderr\": 0.023854795680971135,\n\
\ \"acc_norm\": 0.33076923076923076,\n \"acc_norm_stderr\": 0.023854795680971135\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2,\n \"acc_stderr\": 0.024388430433987664,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.024388430433987664\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.3403361344537815,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.3403361344537815,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5027522935779817,\n \"acc_stderr\": 0.021436998359765317,\n \"\
acc_norm\": 0.5027522935779817,\n \"acc_norm_stderr\": 0.021436998359765317\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.19907407407407407,\n \"acc_stderr\": 0.027232298462690218,\n \"\
acc_norm\": 0.19907407407407407,\n \"acc_norm_stderr\": 0.027232298462690218\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03460228327239171,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03460228327239171\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5189873417721519,\n \"acc_stderr\": 0.03252375148090448,\n \
\ \"acc_norm\": 0.5189873417721519,\n \"acc_norm_stderr\": 0.03252375148090448\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.38565022421524664,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.38565022421524664,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.043285772152629715,\n\
\ \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.043285772152629715\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5206611570247934,\n \"acc_stderr\": 0.04560456086387235,\n \"\
acc_norm\": 0.5206611570247934,\n \"acc_norm_stderr\": 0.04560456086387235\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04826217294139894,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04826217294139894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.36809815950920244,\n \"acc_stderr\": 0.03789213935838396,\n\
\ \"acc_norm\": 0.36809815950920244,\n \"acc_norm_stderr\": 0.03789213935838396\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.044642857142857116,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.044642857142857116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.44660194174757284,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.44660194174757284,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.594017094017094,\n\
\ \"acc_stderr\": 0.03217180182641086,\n \"acc_norm\": 0.594017094017094,\n\
\ \"acc_norm_stderr\": 0.03217180182641086\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4648786717752235,\n\
\ \"acc_stderr\": 0.017835798806290642,\n \"acc_norm\": 0.4648786717752235,\n\
\ \"acc_norm_stderr\": 0.017835798806290642\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.41040462427745666,\n \"acc_stderr\": 0.02648339204209818,\n\
\ \"acc_norm\": 0.41040462427745666,\n \"acc_norm_stderr\": 0.02648339204209818\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\
\ \"acc_stderr\": 0.014400296429225624,\n \"acc_norm\": 0.24581005586592178,\n\
\ \"acc_norm_stderr\": 0.014400296429225624\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.028452639985088003,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.028452639985088003\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.40514469453376206,\n\
\ \"acc_stderr\": 0.02788238379132596,\n \"acc_norm\": 0.40514469453376206,\n\
\ \"acc_norm_stderr\": 0.02788238379132596\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.41358024691358025,\n \"acc_stderr\": 0.027402042040269952,\n\
\ \"acc_norm\": 0.41358024691358025,\n \"acc_norm_stderr\": 0.027402042040269952\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3049645390070922,\n \"acc_stderr\": 0.027464708442022135,\n \
\ \"acc_norm\": 0.3049645390070922,\n \"acc_norm_stderr\": 0.027464708442022135\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.31747066492829207,\n\
\ \"acc_stderr\": 0.011888892068809312,\n \"acc_norm\": 0.31747066492829207,\n\
\ \"acc_norm_stderr\": 0.011888892068809312\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.024398192986654924,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.024398192986654924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.37745098039215685,\n \"acc_stderr\": 0.019610851474880286,\n \
\ \"acc_norm\": 0.37745098039215685,\n \"acc_norm_stderr\": 0.019610851474880286\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.41818181818181815,\n\
\ \"acc_stderr\": 0.0472457740573157,\n \"acc_norm\": 0.41818181818181815,\n\
\ \"acc_norm_stderr\": 0.0472457740573157\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.46938775510204084,\n \"acc_stderr\": 0.031949171367580624,\n\
\ \"acc_norm\": 0.46938775510204084,\n \"acc_norm_stderr\": 0.031949171367580624\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4427860696517413,\n\
\ \"acc_stderr\": 0.03512310964123936,\n \"acc_norm\": 0.4427860696517413,\n\
\ \"acc_norm_stderr\": 0.03512310964123936\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4327485380116959,\n \"acc_stderr\": 0.03799978644370607,\n\
\ \"acc_norm\": 0.4327485380116959,\n \"acc_norm_stderr\": 0.03799978644370607\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.015866346401384308,\n \"mc2\": 0.45797553398869045,\n\
\ \"mc2_stderr\": 0.015928281785783458\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6132596685082873,\n \"acc_stderr\": 0.013687214761883042\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.053828658074298714,\n \
\ \"acc_stderr\": 0.006216328640238136\n }\n}\n```"
repo_url: https://huggingface.co/azarafrooz/gemma-2b-it-nlai-v0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|arc:challenge|25_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|gsm8k|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hellaswag|10_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T12-20-29.148062.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T12-20-29.148062.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- '**/details_harness|winogrande|5_2024-04-15T12-20-29.148062.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T12-20-29.148062.parquet'
- config_name: results
data_files:
- split: 2024_04_15T12_20_29.148062
path:
- results_2024-04-15T12-20-29.148062.parquet
- split: latest
path:
- results_2024-04-15T12-20-29.148062.parquet
---
# Dataset Card for Evaluation run of azarafrooz/gemma-2b-it-nlai-v0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [azarafrooz/gemma-2b-it-nlai-v0](https://huggingface.co/azarafrooz/gemma-2b-it-nlai-v0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_azarafrooz__gemma-2b-it-nlai-v0",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-04-15T12:20:29.148062](https://huggingface.co/datasets/open-llm-leaderboard/details_azarafrooz__gemma-2b-it-nlai-v0/blob/main/results_2024-04-15T12-20-29.148062.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.37660169444969677,
"acc_stderr": 0.03380673348549725,
"acc_norm": 0.3811167415748182,
"acc_norm_stderr": 0.03461538621374684,
"mc1": 0.28886168910648713,
"mc1_stderr": 0.015866346401384308,
"mc2": 0.45797553398869045,
"mc2_stderr": 0.015928281785783458
},
"harness|arc:challenge|25": {
"acc": 0.4052901023890785,
"acc_stderr": 0.014346869060229327,
"acc_norm": 0.4402730375426621,
"acc_norm_stderr": 0.014506769524804246
},
"harness|hellaswag|10": {
"acc": 0.48137821151165106,
"acc_stderr": 0.004986319587524964,
"acc_norm": 0.6266679944234216,
"acc_norm_stderr": 0.004827006520802888
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3355263157894737,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.3355263157894737,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4226415094339623,
"acc_stderr": 0.030402331445769537,
"acc_norm": 0.4226415094339623,
"acc_norm_stderr": 0.030402331445769537
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3574468085106383,
"acc_stderr": 0.03132941789476425,
"acc_norm": 0.3574468085106383,
"acc_norm_stderr": 0.03132941789476425
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643895,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643895
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3193548387096774,
"acc_stderr": 0.026522709674667765,
"acc_norm": 0.3193548387096774,
"acc_norm_stderr": 0.026522709674667765
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.0319474007226554,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.0319474007226554
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.03888176921674098,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.03888176921674098
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4595959595959596,
"acc_stderr": 0.035507024651313425,
"acc_norm": 0.4595959595959596,
"acc_norm_stderr": 0.035507024651313425
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.47150259067357514,
"acc_stderr": 0.03602573571288442,
"acc_norm": 0.47150259067357514,
"acc_norm_stderr": 0.03602573571288442
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.33076923076923076,
"acc_stderr": 0.023854795680971135,
"acc_norm": 0.33076923076923076,
"acc_norm_stderr": 0.023854795680971135
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2,
"acc_stderr": 0.024388430433987664,
"acc_norm": 0.2,
"acc_norm_stderr": 0.024388430433987664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3403361344537815,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.3403361344537815,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5027522935779817,
"acc_stderr": 0.021436998359765317,
"acc_norm": 0.5027522935779817,
"acc_norm_stderr": 0.021436998359765317
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19907407407407407,
"acc_stderr": 0.027232298462690218,
"acc_norm": 0.19907407407407407,
"acc_norm_stderr": 0.027232298462690218
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03460228327239171,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03460228327239171
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5189873417721519,
"acc_stderr": 0.03252375148090448,
"acc_norm": 0.5189873417721519,
"acc_norm_stderr": 0.03252375148090448
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.38565022421524664,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.38565022421524664,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4198473282442748,
"acc_stderr": 0.043285772152629715,
"acc_norm": 0.4198473282442748,
"acc_norm_stderr": 0.043285772152629715
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5206611570247934,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.5206611570247934,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04826217294139894,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04826217294139894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.36809815950920244,
"acc_stderr": 0.03789213935838396,
"acc_norm": 0.36809815950920244,
"acc_norm_stderr": 0.03789213935838396
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.044642857142857116,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.044642857142857116
},
"harness|hendrycksTest-management|5": {
"acc": 0.44660194174757284,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.44660194174757284,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.594017094017094,
"acc_stderr": 0.03217180182641086,
"acc_norm": 0.594017094017094,
"acc_norm_stderr": 0.03217180182641086
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4648786717752235,
"acc_stderr": 0.017835798806290642,
"acc_norm": 0.4648786717752235,
"acc_norm_stderr": 0.017835798806290642
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.02648339204209818,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.02648339204209818
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225624,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225624
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.028452639985088003,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.028452639985088003
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.40514469453376206,
"acc_stderr": 0.02788238379132596,
"acc_norm": 0.40514469453376206,
"acc_norm_stderr": 0.02788238379132596
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.41358024691358025,
"acc_stderr": 0.027402042040269952,
"acc_norm": 0.41358024691358025,
"acc_norm_stderr": 0.027402042040269952
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3049645390070922,
"acc_stderr": 0.027464708442022135,
"acc_norm": 0.3049645390070922,
"acc_norm_stderr": 0.027464708442022135
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.31747066492829207,
"acc_stderr": 0.011888892068809312,
"acc_norm": 0.31747066492829207,
"acc_norm_stderr": 0.011888892068809312
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.024398192986654924,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.024398192986654924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.37745098039215685,
"acc_stderr": 0.019610851474880286,
"acc_norm": 0.37745098039215685,
"acc_norm_stderr": 0.019610851474880286
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.0472457740573157,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.0472457740573157
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46938775510204084,
"acc_stderr": 0.031949171367580624,
"acc_norm": 0.46938775510204084,
"acc_norm_stderr": 0.031949171367580624
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4427860696517413,
"acc_stderr": 0.03512310964123936,
"acc_norm": 0.4427860696517413,
"acc_norm_stderr": 0.03512310964123936
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4327485380116959,
"acc_stderr": 0.03799978644370607,
"acc_norm": 0.4327485380116959,
"acc_norm_stderr": 0.03799978644370607
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28886168910648713,
"mc1_stderr": 0.015866346401384308,
"mc2": 0.45797553398869045,
"mc2_stderr": 0.015928281785783458
},
"harness|winogrande|5": {
"acc": 0.6132596685082873,
"acc_stderr": 0.013687214761883042
},
"harness|gsm8k|5": {
"acc": 0.053828658074298714,
"acc_stderr": 0.006216328640238136
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_55 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1315757524.0
num_examples: 258397
download_size: 1345022572
dataset_size: 1315757524.0
---
# Dataset Card for "chunk_55"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hypno-ai/margiela-style | ---
license: apache-2.0
---
|
mrzbrt/test_audioset | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 1267533455.0
num_examples: 401
download_size: 1227173992
dataset_size: 1267533455.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_1.4b_bo16_2_64_mix_50_kl_0.1_prm_410m_thr_1.0_seed_2 | ---
dataset_info:
config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43744390
num_examples: 18928
- name: epoch_1
num_bytes: 43853451
num_examples: 18928
- name: epoch_2
num_bytes: 43703566
num_examples: 18928
- name: epoch_3
num_bytes: 43575131
num_examples: 18928
- name: epoch_4
num_bytes: 43518590
num_examples: 18928
- name: epoch_5
num_bytes: 43495671
num_examples: 18928
- name: epoch_6
num_bytes: 43474824
num_examples: 18928
- name: epoch_7
num_bytes: 43481960
num_examples: 18928
- name: epoch_8
num_bytes: 43468743
num_examples: 18928
- name: epoch_9
num_bytes: 43462697
num_examples: 18928
download_size: 278955072
dataset_size: 435779023
configs:
- config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
data_files:
- split: epoch_0
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_9-*
---
|
tanmay2798/codedataset | ---
license: llama2
annotations_creators:
- no-annotation
language:
- en
language_creators:
- expert-generated
multilinguality:
- monolingual
pretty_name: codedataset
size_categories:
- n<1K
source_datasets:
- original
tags: []
task_categories:
- text2text-generation
- text-generation
- question-answering
task_ids:
- explanation-generation
- open-book-qa
- closed-book-qa
- abstractive-qa
- language-modeling
- dialogue-modeling
- extractive-qa
---
|
LorenzH/juliet_test_suite_c_1_3 | ---
license: cc0-1.0
task_categories:
- text-classification
pretty_name: Juliet Test Suite 1.3
size_categories:
- 10K<n<100K
---
# Dataset Card for the Juliet Test Suite 1.3
### Dataset Summary
This dataset contains all test cases from NIST's [Juliet test suite](https://samate.nist.gov/SARD/test-suites/112) for the C and C++ programming languages. For each test case it provides a benign and a defective implementation, extracted by means of the OMITGOOD and OMITBAD preprocessor macros of the Juliet test suite.
### Supported Tasks and Leaderboards
Software defect prediction, code clone detection.
### Languages
The C and C++ programming languages.
## Dataset Structure
### Data Instances
### Data Fields
| index | name | type | description |
| --- | --- | --- | --- |
| 0 | index | int | The index of each sample in the dataset. |
| 1 | filename | str | The path to the test case including the file name. |
| 2 | class | int | The defect class, i.e., the CWE number of the collection from which the sample was taken. |
| 3 | good | str | The code of the benign implementation. |
| 4 | bad | str | The code of the defective implementation. |
### Data Splits
| type | size |
|------|------|
| train | 80706 cases |
| test | 20177 cases |
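Since each row pairs a benign and a defective implementation, a common preprocessing step for defect prediction is to flatten every record into two labelled samples. A minimal sketch, assuming the field names from the table above (the record values here are illustrative placeholders, not actual dataset content):

```python
# Minimal sketch (not part of the dataset itself): one record, with the field
# names from the table above and illustrative placeholder values, flattened
# into two labelled samples for a defect-prediction task.

record = {
    "index": 0,
    "filename": "CWE121_Stack_Based_Buffer_Overflow/example_01.c",  # hypothetical path
    "class": 121,  # CWE number of the collection the sample was taken from
    "good": "void f(void) { char buf[16]; /* bounds-checked copy */ }",
    "bad": "void f(void) { char buf[16]; /* unchecked strcpy */ }",
}

def to_pairs(record):
    """Split one record into a benign (label 0) and a defective (label 1) sample."""
    return [
        {"code": record["good"], "cwe": record["class"], "label": 0},
        {"code": record["bad"], "cwe": record["class"], "label": 1},
    ]

pairs = to_pairs(record)  # two labelled samples per test case
```

The same flattening applies per split, so the 80706 train cases yield twice as many labelled samples.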
## Dataset Creation
### Curation Rationale
### Source Data
https://samate.nist.gov/SARD/test-suites/112
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
The Juliet test suite is a synthetic dataset, meaning that all samples have been manually crafted. They are therefore not entirely representative of actual software defects found in the wild. A classifier trained on these samples may suffer from decreased predictive performance, leading to gross misclassifications; critical software defects may thus be overlooked when such a model is applied in a realistic environment.
## Additional Information
### Dataset Curators
https://github.com/lorenz9314/
### Licensing Information
### Citation Information
### Contributions
|
open-llm-leaderboard/details_Charlie911__MultiLoRA-mmlu | ---
pretty_name: Evaluation run of Charlie911/MultiLoRA-mmlu
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Charlie911/MultiLoRA-mmlu](https://huggingface.co/Charlie911/MultiLoRA-mmlu)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__MultiLoRA-mmlu\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T00:16:37.919745](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__MultiLoRA-mmlu/blob/main/results_2024-02-02T00-16-37.919745.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.49682393519091933,\n\
\ \"acc_stderr\": 0.034234549326781244,\n \"acc_norm\": 0.5024513039750834,\n\
\ \"acc_norm_stderr\": 0.03499839555197597,\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.5020042655317223,\n\
\ \"mc2_stderr\": 0.015409102519026984\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49658703071672355,\n \"acc_stderr\": 0.014611050403244077,\n\
\ \"acc_norm\": 0.5238907849829352,\n \"acc_norm_stderr\": 0.014594701798071654\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5803624775941048,\n\
\ \"acc_stderr\": 0.004924910433106353,\n \"acc_norm\": 0.7720573590918144,\n\
\ \"acc_norm_stderr\": 0.0041864806453155625\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5169811320754717,\n \"acc_stderr\": 0.030755120364119905,\n\
\ \"acc_norm\": 0.5169811320754717,\n \"acc_norm_stderr\": 0.030755120364119905\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416542,\n\
\ \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416542\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.032436186361081004,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.032436186361081004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159393,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159393\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.023266512213730578,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.023266512213730578\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.535483870967742,\n\
\ \"acc_stderr\": 0.02837228779796293,\n \"acc_norm\": 0.535483870967742,\n\
\ \"acc_norm_stderr\": 0.02837228779796293\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264715,\n\
\ \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264715\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512567,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512567\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5909090909090909,\n \"acc_stderr\": 0.03502975799413007,\n \"\
acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.03502975799413007\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.03292296639155141,\n\
\ \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 0.03292296639155141\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4717948717948718,\n \"acc_stderr\": 0.025310639254933893,\n\
\ \"acc_norm\": 0.4717948717948718,\n \"acc_norm_stderr\": 0.025310639254933893\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275805,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275805\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4369747899159664,\n \"acc_stderr\": 0.03221943636566196,\n \
\ \"acc_norm\": 0.4369747899159664,\n \"acc_norm_stderr\": 0.03221943636566196\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6770642201834862,\n\
\ \"acc_stderr\": 0.02004811592341531,\n \"acc_norm\": 0.6770642201834862,\n\
\ \"acc_norm_stderr\": 0.02004811592341531\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n\
\ \"acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.696078431372549,\n \"acc_stderr\": 0.032282103870378935,\n \"\
acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.032282103870378935\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6877637130801688,\n \"acc_stderr\": 0.03016513786784701,\n \
\ \"acc_norm\": 0.6877637130801688,\n \"acc_norm_stderr\": 0.03016513786784701\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5560538116591929,\n\
\ \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.5560538116591929,\n\
\ \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5537190082644629,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\"\
: 0.5537190082644629,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5337423312883436,\n \"acc_stderr\": 0.039194155450484096,\n\
\ \"acc_norm\": 0.5337423312883436,\n \"acc_norm_stderr\": 0.039194155450484096\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7692307692307693,\n\
\ \"acc_stderr\": 0.0276019213814176,\n \"acc_norm\": 0.7692307692307693,\n\
\ \"acc_norm_stderr\": 0.0276019213814176\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.669220945083014,\n\
\ \"acc_stderr\": 0.01682481846256376,\n \"acc_norm\": 0.669220945083014,\n\
\ \"acc_norm_stderr\": 0.01682481846256376\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.546242774566474,\n \"acc_stderr\": 0.02680372058320617,\n\
\ \"acc_norm\": 0.546242774566474,\n \"acc_norm_stderr\": 0.02680372058320617\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925293,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925293\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.028555827516528784,\n\
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.028555827516528784\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n\
\ \"acc_stderr\": 0.02804339985821063,\n \"acc_norm\": 0.5787781350482315,\n\
\ \"acc_norm_stderr\": 0.02804339985821063\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668767,\n\
\ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668767\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.02872386385328128,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.02872386385328128\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.363754889178618,\n\
\ \"acc_stderr\": 0.012286991879902887,\n \"acc_norm\": 0.363754889178618,\n\
\ \"acc_norm_stderr\": 0.012286991879902887\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.03027332507734575,\n\
\ \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.03027332507734575\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4869281045751634,\n \"acc_stderr\": 0.020220920829626916,\n \
\ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.020220920829626916\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\
\ \"acc_stderr\": 0.04738198703545484,\n \"acc_norm\": 0.5727272727272728,\n\
\ \"acc_norm_stderr\": 0.04738198703545484\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n\
\ \"acc_stderr\": 0.03280188205348644,\n \"acc_norm\": 0.6865671641791045,\n\
\ \"acc_norm_stderr\": 0.03280188205348644\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.03446296217088427,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.03446296217088427\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.5020042655317223,\n\
\ \"mc2_stderr\": 0.015409102519026984\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7221783741120757,\n \"acc_stderr\": 0.012588918183871596\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15845337376800606,\n \
\ \"acc_stderr\": 0.010058474790238962\n }\n}\n```"
repo_url: https://huggingface.co/Charlie911/MultiLoRA-mmlu
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|arc:challenge|25_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|gsm8k|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hellaswag|10_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T00-16-37.919745.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T00-16-37.919745.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- '**/details_harness|winogrande|5_2024-02-02T00-16-37.919745.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T00-16-37.919745.parquet'
- config_name: results
data_files:
- split: 2024_02_02T00_16_37.919745
path:
- results_2024-02-02T00-16-37.919745.parquet
- split: latest
path:
- results_2024-02-02T00-16-37.919745.parquet
---
# Dataset Card for Evaluation run of Charlie911/MultiLoRA-mmlu
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Charlie911/MultiLoRA-mmlu](https://huggingface.co/Charlie911/MultiLoRA-mmlu) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Charlie911__MultiLoRA-mmlu",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-02T00:16:37.919745](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__MultiLoRA-mmlu/blob/main/results_2024-02-02T00-16-37.919745.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.49682393519091933,
"acc_stderr": 0.034234549326781244,
"acc_norm": 0.5024513039750834,
"acc_norm_stderr": 0.03499839555197597,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.5020042655317223,
"mc2_stderr": 0.015409102519026984
},
"harness|arc:challenge|25": {
"acc": 0.49658703071672355,
"acc_stderr": 0.014611050403244077,
"acc_norm": 0.5238907849829352,
"acc_norm_stderr": 0.014594701798071654
},
"harness|hellaswag|10": {
"acc": 0.5803624775941048,
"acc_stderr": 0.004924910433106353,
"acc_norm": 0.7720573590918144,
"acc_norm_stderr": 0.0041864806453155625
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5169811320754717,
"acc_stderr": 0.030755120364119905,
"acc_norm": 0.5169811320754717,
"acc_norm_stderr": 0.030755120364119905
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5,
"acc_stderr": 0.04181210050035455,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04181210050035455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03708284662416542,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03708284662416542
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.032436186361081004,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.032436186361081004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159393,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159393
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.023266512213730578,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.023266512213730578
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.535483870967742,
"acc_stderr": 0.02837228779796293,
"acc_norm": 0.535483870967742,
"acc_norm_stderr": 0.02837228779796293
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.03445487686264715,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.03445487686264715
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512567,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512567
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.03292296639155141,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.03292296639155141
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4717948717948718,
"acc_stderr": 0.025310639254933893,
"acc_norm": 0.4717948717948718,
"acc_norm_stderr": 0.025310639254933893
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275805,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275805
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4369747899159664,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.4369747899159664,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6770642201834862,
"acc_stderr": 0.02004811592341531,
"acc_norm": 0.6770642201834862,
"acc_norm_stderr": 0.02004811592341531
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.032282103870378935,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.032282103870378935
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6877637130801688,
"acc_stderr": 0.03016513786784701,
"acc_norm": 0.6877637130801688,
"acc_norm_stderr": 0.03016513786784701
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5560538116591929,
"acc_stderr": 0.03334625674242728,
"acc_norm": 0.5560538116591929,
"acc_norm_stderr": 0.03334625674242728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5537190082644629,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.5537190082644629,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5337423312883436,
"acc_stderr": 0.039194155450484096,
"acc_norm": 0.5337423312883436,
"acc_norm_stderr": 0.039194155450484096
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7692307692307693,
"acc_stderr": 0.0276019213814176,
"acc_norm": 0.7692307692307693,
"acc_norm_stderr": 0.0276019213814176
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.669220945083014,
"acc_stderr": 0.01682481846256376,
"acc_norm": 0.669220945083014,
"acc_norm_stderr": 0.01682481846256376
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.546242774566474,
"acc_stderr": 0.02680372058320617,
"acc_norm": 0.546242774566474,
"acc_norm_stderr": 0.02680372058320617
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925293,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925293
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.028555827516528784,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.028555827516528784
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5787781350482315,
"acc_stderr": 0.02804339985821063,
"acc_norm": 0.5787781350482315,
"acc_norm_stderr": 0.02804339985821063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.027628737155668767,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.027628737155668767
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.02872386385328128,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.02872386385328128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.363754889178618,
"acc_stderr": 0.012286991879902887,
"acc_norm": 0.363754889178618,
"acc_norm_stderr": 0.012286991879902887
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5404411764705882,
"acc_stderr": 0.03027332507734575,
"acc_norm": 0.5404411764705882,
"acc_norm_stderr": 0.03027332507734575
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.020220920829626916,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.020220920829626916
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.04738198703545484,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.04738198703545484
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6865671641791045,
"acc_stderr": 0.03280188205348644,
"acc_norm": 0.6865671641791045,
"acc_norm_stderr": 0.03280188205348644
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.03446296217088427,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.03446296217088427
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.5020042655317223,
"mc2_stderr": 0.015409102519026984
},
"harness|winogrande|5": {
"acc": 0.7221783741120757,
"acc_stderr": 0.012588918183871596
},
"harness|gsm8k|5": {
"acc": 0.15845337376800606,
"acc_stderr": 0.010058474790238962
}
}
```
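The `*_stderr` fields in the JSON above are standard errors for the corresponding accuracy estimates; under the usual normal approximation, each pair can be turned into a rough 95% confidence interval. A minimal sketch using the `harness|gsm8k|5` entry from the results above (the 1.96 critical value is the standard normal-approximation choice, not something the harness itself reports):

```python
# Values copied from the "harness|gsm8k|5" entry in the results JSON above.
acc = 0.15845337376800606
stderr = 0.010058474790238962

# Normal-approximation 95% confidence interval: acc +/- 1.96 * stderr.
z = 1.96
low, high = acc - z * stderr, acc + z * stderr
print(f"gsm8k 5-shot acc: {acc:.3f} (95% CI: {low:.3f}-{high:.3f})")
```

The same calculation applies to any of the per-task `acc`/`acc_stderr` pairs reported above.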
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
gemma111/customhkcode2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5826
num_examples: 39
download_size: 2572
dataset_size: 5826
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|