| datasetId | card |
|---|---|
umd-zhou-lab/sRecycled_Alpaca | ---
dataset_info:
features:
- name: data
struct:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 62380384
num_examples: 37114
download_size: 33868520
dataset_size: 62380384
---
# Dataset Card for "sRecycled_Alpaca"
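The card documents only the schema, so here is a minimal sketch of how a record with this nested `data` struct (instruction/input/output strings) might be rendered into a training prompt. The Alpaca-style template below is an assumption for illustration; the card itself does not specify a prompt format.

```python
# Hypothetical sketch: render one {instruction, input, output} record
# (wrapped in the `data` struct from the schema above) as a single
# prompt string. The template is assumed, not taken from this card.

def build_prompt(record: dict) -> str:
    """Format a record with or without an `input` field."""
    data = record["data"]
    if data.get("input"):
        return (
            "Below is an instruction that describes a task, paired with an input.\n\n"
            f"### Instruction:\n{data['instruction']}\n\n"
            f"### Input:\n{data['input']}\n\n"
            f"### Response:\n{data['output']}"
        )
    return (
        "Below is an instruction that describes a task.\n\n"
        f"### Instruction:\n{data['instruction']}\n\n"
        f"### Response:\n{data['output']}"
    )

# invented sample record matching the schema
example = {"data": {"instruction": "Add the numbers.", "input": "2 and 3", "output": "5"}}
print(build_prompt(example))
```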
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
crumb/Clean-Instruct-3M | ---
language: en
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 1899089494.3291264
num_examples: 3085812
download_size: 1173097125
dataset_size: 1899089494.3291264
---
# Dataset Card for "Clean-Instruct-3M"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/competition_math | ---
dataset_info:
features:
- name: problem
dtype: string
- name: level
dtype: string
- name: type
dtype: string
- name: solution
dtype: string
splits:
- name: train
num_bytes: 5984772
num_examples: 7500
download_size: 2992145
dataset_size: 5984772
---
# Dataset Card for "competition_math"
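Given the `problem`/`level`/`type`/`solution` schema above, a common use is slicing the 7,500 training problems by difficulty or topic. A minimal stdlib sketch over invented sample rows (the `Level N` strings and topic names are assumptions about the field values, not taken from this card):

```python
# Hypothetical sketch: group records by the `level` and `type` fields
# from the schema above. Sample rows are invented for illustration;
# real rows would come from loading the dataset itself.
from collections import Counter

rows = [
    {"problem": "...", "level": "Level 1", "type": "Algebra", "solution": "..."},
    {"problem": "...", "level": "Level 5", "type": "Geometry", "solution": "..."},
    {"problem": "...", "level": "Level 5", "type": "Algebra", "solution": "..."},
]

# count problems per difficulty level
by_level = Counter(r["level"] for r in rows)
# select the hardest problems of one topic
hard_algebra = [r for r in rows if r["level"] == "Level 5" and r["type"] == "Algebra"]
print(by_level, len(hard_algebra))
```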
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yimingzhang/uf_safe_v1 | ---
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
- split: test_prefs
path: data/test_prefs-*
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_prefs
num_bytes: 794558
num_examples: 1766
- name: test_prefs
num_bytes: 39816
num_examples: 86
download_size: 468541
dataset_size: 834374
---
# Dataset Card for "uf_safe_v1"
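Each example in this preference dataset pairs a `chosen` and a `rejected` chat, each a list of `{content, role}` turns per the schema above. A minimal sketch of flattening one side for inspection (the sample record is invented for illustration):

```python
# Hypothetical sketch: flatten one chat (a list of {content, role}
# turns, as in the schema above) into readable text. The record below
# is invented, not taken from the dataset.

def render_chat(turns: list) -> str:
    """Join a list of {content, role} turns into 'role: content' lines."""
    return "\n".join(f"{t['role']}: {t['content']}" for t in turns)

example = {
    "chosen": [
        {"role": "user", "content": "Hi"},
        {"role": "assistant", "content": "Hello! How can I help?"},
    ],
    "rejected": [
        {"role": "user", "content": "Hi"},
        {"role": "assistant", "content": "What do you want?"},
    ],
}
print(render_chat(example["chosen"]))
```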
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/jeanne_d_arc_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of jeanne_d_arc/ジャンヌ・ダルク/贞德 (Fate/Grand Order)
This is the dataset of jeanne_d_arc/ジャンヌ・ダルク/贞德 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `blonde_hair, long_hair, breasts, blue_eyes, very_long_hair, large_breasts, braid, single_braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 833.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jeanne_d_arc_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 727.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jeanne_d_arc_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1256 | 1.33 GiB | [Download](https://huggingface.co/datasets/CyberHarem/jeanne_d_arc_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/jeanne_d_arc_fgo',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, armored_dress, gauntlets, headpiece, looking_at_viewer, solo, sword, faulds, chain, holding, sheath, smile, white_background, flag, simple_background, black_thighhighs, capelet, bare_shoulders, fur_trim |
| 1 | 9 |  |  |  |  |  | 1girl, armored_dress, bare_shoulders, chain, fur_trim, gauntlets, headpiece, solo, flag, looking_at_viewer, smile, closed_mouth, holding, faulds |
| 2 | 6 |  |  |  |  |  | 1girl, bare_shoulders, chain, headpiece, looking_at_viewer, solo, upper_body, closed_mouth, fur_trim, gauntlets, smile, armored_dress |
| 3 | 8 |  |  |  |  |  | 1girl, capelet, headpiece, looking_at_viewer, solo, upper_body, chain, gauntlets, smile, armored_dress, blush, closed_mouth, white_background, simple_background |
| 4 | 6 |  |  |  |  |  | 1girl, headpiece, looking_at_viewer, smile, solo, white_dress, cleavage, long_sleeves, open_mouth, simple_background, white_background |
| 5 | 16 |  |  |  |  |  | 1girl, solo, looking_at_viewer, smile, bare_shoulders, closed_mouth, black_bow, black_dress, see-through, black_gloves, black_thighhighs, blush, cleavage, choker, collarbone, hair_flower, medium_breasts, hair_bow, necklace, hair_between_eyes, ribbon, skirt_hold, side_slit, standing |
| 6 | 5 |  |  |  |  |  | 1girl, black_bikini, black_hairband, hooded_jacket, long_sleeves, looking_at_viewer, open_jacket, solo, black_scrunchie, blue_jacket, blue_sky, blush, cleavage, collarbone, day, hair_scrunchie, outdoors, cloud, hair_between_eyes, navel, ocean, open_mouth, thigh_strap, :d, ass_visible_through_thighs, closed_mouth, front-tie_bikini_top, holding, long_braid, sitting, standing, water |
| 7 | 9 |  |  |  |  |  | 1girl, floral_print, looking_at_viewer, obi, solo, headpiece, print_kimono, wide_sleeves, long_sleeves, smile, alternate_costume, blush, closed_mouth, yukata, blue_kimono, flower, hair_between_eyes, hair_ornament, holding_umbrella, white_background |
| 8 | 6 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, necktie, sleeveless_shirt, smile, white_shirt, bare_shoulders, short_shorts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | armored_dress | gauntlets | headpiece | looking_at_viewer | solo | sword | faulds | chain | holding | sheath | smile | white_background | flag | simple_background | black_thighhighs | capelet | bare_shoulders | fur_trim | closed_mouth | upper_body | blush | white_dress | cleavage | long_sleeves | open_mouth | black_bow | black_dress | see-through | black_gloves | choker | collarbone | hair_flower | medium_breasts | hair_bow | necklace | hair_between_eyes | ribbon | skirt_hold | side_slit | standing | black_bikini | black_hairband | hooded_jacket | open_jacket | black_scrunchie | blue_jacket | blue_sky | day | hair_scrunchie | outdoors | cloud | navel | ocean | thigh_strap | :d | ass_visible_through_thighs | front-tie_bikini_top | long_braid | sitting | water | floral_print | obi | print_kimono | wide_sleeves | alternate_costume | yukata | blue_kimono | flower | hair_ornament | holding_umbrella | necktie | sleeveless_shirt | white_shirt | short_shorts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:------------|:------------|:--------------------|:-------|:--------|:---------|:--------|:----------|:---------|:--------|:-------------------|:-------|:--------------------|:-------------------|:----------|:-----------------|:-----------|:---------------|:-------------|:--------|:--------------|:-----------|:---------------|:-------------|:------------|:--------------|:--------------|:---------------|:---------|:-------------|:--------------|:-----------------|:-----------|:-----------|:--------------------|:---------|:-------------|:------------|:-----------|:---------------|:-----------------|:----------------|:--------------|:------------------|:--------------|:-----------|:------|:-----------------|:-----------|:--------|:--------|:--------|:--------------|:-----|:-----------------------------|:-----------------------|:-------------|:----------|:--------|:---------------|:------|:---------------|:---------------|:--------------------|:---------|:--------------|:---------|:----------------|:-------------------|:----------|:-------------------|:--------------|:---------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | | X | | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | X | X | | | X | | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | X | X | X | X | | | X | | | X | X | | X | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | | X | X | X | | | | | | X | X | | X | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 16 |  |  |  |  |  | X | | | | X | X | | | | | | X | | | | X | | X | | X | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | | X | X | | | | X | | | | | | | | | | X | | X | | X | X | X | | | | | | X | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 7 | 9 |  |  |  |  |  | X | | | X | X | X | | | | | | X | X | | | | | | | X | | X | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | |
| 8 | 6 |  |  |  |  |  | X | | | | X | X | | | | | | X | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X |
|
open-llm-leaderboard/details_macadeliccc__MBX-7B-v3-DPO | ---
pretty_name: Evaluation run of macadeliccc/MBX-7B-v3-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [macadeliccc/MBX-7B-v3-DPO](https://huggingface.co/macadeliccc/MBX-7B-v3-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__MBX-7B-v3-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-11T18:53:41.876317](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__MBX-7B-v3-DPO/blob/main/results_2024-02-11T18-53-41.876317.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6554435066939919,\n\
\ \"acc_stderr\": 0.03198817220538892,\n \"acc_norm\": 0.6546676568515765,\n\
\ \"acc_norm_stderr\": 0.03266175930986744,\n \"mc1\": 0.5862913096695227,\n\
\ \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.7399782698428227,\n\
\ \"mc2_stderr\": 0.014395363250478046\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7150170648464164,\n \"acc_stderr\": 0.013191348179838793,\n\
\ \"acc_norm\": 0.735494880546075,\n \"acc_norm_stderr\": 0.012889272949313368\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.715893248356901,\n\
\ \"acc_stderr\": 0.004500662294697923,\n \"acc_norm\": 0.8910575582553276,\n\
\ \"acc_norm_stderr\": 0.003109302300176215\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493857,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493857\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055277,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055277\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.023025899617188716,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.023025899617188716\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8181818181818182,\n \"acc_stderr\": 0.027479603010538797,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.027479603010538797\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508297,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508297\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n\
\ \"acc_stderr\": 0.016602564615049942,\n \"acc_norm\": 0.4402234636871508,\n\
\ \"acc_norm_stderr\": 0.016602564615049942\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"\
acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5862913096695227,\n\
\ \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.7399782698428227,\n\
\ \"mc2_stderr\": 0.014395363250478046\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8555643251775849,\n \"acc_stderr\": 0.009879767358079232\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6967399545109931,\n \
\ \"acc_stderr\": 0.012661502663418697\n }\n}\n```"
repo_url: https://huggingface.co/macadeliccc/MBX-7B-v3-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|arc:challenge|25_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|gsm8k|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hellaswag|10_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T18-53-41.876317.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T18-53-41.876317.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- '**/details_harness|winogrande|5_2024-02-11T18-53-41.876317.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-11T18-53-41.876317.parquet'
- config_name: results
data_files:
- split: 2024_02_11T18_53_41.876317
path:
- results_2024-02-11T18-53-41.876317.parquet
- split: latest
path:
- results_2024-02-11T18-53-41.876317.parquet
---
# Dataset Card for Evaluation run of macadeliccc/MBX-7B-v3-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/MBX-7B-v3-DPO](https://huggingface.co/macadeliccc/MBX-7B-v3-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__MBX-7B-v3-DPO",
"harness_winogrande_5",
split="train")
```
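Since split names encode the run timestamp with underscores in place of `-` and `:` (e.g. `2024_02_11T18_53_41.876317`), you can also pick the newest run programmatically instead of relying on the `latest` alias. A minimal sketch using only the standard library; the `latest_split` helper is hypothetical, not part of the `datasets` API:

```python
from datetime import datetime

def latest_split(split_names):
    """Return the most recent timestamped split name.

    Split names follow the pattern 2024_02_11T18_53_41.876317;
    the literal "latest" alias is skipped so only real runs
    are compared.
    """
    timestamped = [s for s in split_names if s != "latest"]
    return max(
        timestamped,
        key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"),
    )

print(latest_split(["2024_02_11T18_53_41.876317", "latest"]))
```

This is mainly useful when a repo accumulates several evaluation runs and you want to compare a specific older split against the newest one.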
## Latest results
These are the [latest results from run 2024-02-11T18:53:41.876317](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__MBX-7B-v3-DPO/blob/main/results_2024-02-11T18-53-41.876317.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6554435066939919,
"acc_stderr": 0.03198817220538892,
"acc_norm": 0.6546676568515765,
"acc_norm_stderr": 0.03266175930986744,
"mc1": 0.5862913096695227,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.7399782698428227,
"mc2_stderr": 0.014395363250478046
},
"harness|arc:challenge|25": {
"acc": 0.7150170648464164,
"acc_stderr": 0.013191348179838793,
"acc_norm": 0.735494880546075,
"acc_norm_stderr": 0.012889272949313368
},
"harness|hellaswag|10": {
"acc": 0.715893248356901,
"acc_stderr": 0.004500662294697923,
"acc_norm": 0.8910575582553276,
"acc_norm_stderr": 0.003109302300176215
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055277,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055277
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188716,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188716
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.027479603010538797,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.027479603010538797
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508297,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508297
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4402234636871508,
"acc_stderr": 0.016602564615049942,
"acc_norm": 0.4402234636871508,
"acc_norm_stderr": 0.016602564615049942
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5862913096695227,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.7399782698428227,
"mc2_stderr": 0.014395363250478046
},
"harness|winogrande|5": {
"acc": 0.8555643251775849,
"acc_stderr": 0.009879767358079232
},
"harness|gsm8k|5": {
"acc": 0.6967399545109931,
"acc_stderr": 0.012661502663418697
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mathematicalmichael/city-transformers-data | ---
license: mit
---
|
AdapterOcean/code_instructions_standardized_cluster_0_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 18536042
num_examples: 7063
download_size: 10345348
dataset_size: 18536042
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_0_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mskov/DaVinci_Completion | ---
license: cc
task_categories:
- text-generation
language:
- en
tags:
- conversational
pretty_name: DaVinComp
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
bigscience-data/roots_es_wikivoyage | ---
language: es
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_es_wikivoyage
# wikivoyage_filtered
- Dataset uid: `wikivoyage_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.0334 % of total
- 0.1097 % of en
- 0.0432 % of fr
- 0.0863 % of es
- 0.0084 % of zh
- 0.0892 % of vi
- 0.0464 % of indic-bn
- 0.0443 % of pt
- 0.0130 % of indic-hi
### BigScience processing steps
#### Filters applied to: en
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_en
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_fr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_es
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: zh
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_zhs
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_vi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-bn
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_pt
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-hi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
|
liuyanchen1015/MULTI_VALUE_qqp_that_resultative_past_participle | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 2757
num_examples: 13
- name: test
num_bytes: 22348
num_examples: 120
- name: train
num_bytes: 23961
num_examples: 121
download_size: 39213
dataset_size: 49066
---
# Dataset Card for "MULTI_VALUE_qqp_that_resultative_past_participle"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
iandennismiller/llama-cpp-scripts | ---
language:
- code
pretty_name: "These are scripts that have helped me to manage llama.cpp, llama models, etc."
tags:
- bash
- llama.cpp
- script
license: mit
---
# llama.cpp scripts
These are scripts that have helped me to manage llama.cpp, llama models, etc.
## Install
Scripts are installed to `~/.local/bin`.
```bash
bash install.sh
```
|
shiviguptta/coco_canny | ---
dataset_info:
features:
- name: file_name
dtype: string
- name: conditional_image_file
dtype: string
- name: text
dtype: string
- name: image
dtype: image
- name: conditional_image
dtype: image
splits:
- name: train
num_bytes: 15412639.0
num_examples: 306
download_size: 15400355
dataset_size: 15412639.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Back-up/html | ---
dataset_info:
features:
- name: title
dtype: string
- name: url
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 162502478.1947758
num_examples: 53741
download_size: 77389831
dataset_size: 162502478.1947758
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "html"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
iamkaikai/amazing_logos_v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 81856445.605
num_examples: 10123
download_size: 75482948
dataset_size: 81856445.605
---
# Dataset Card for "amazing_logos_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
diwank/expertllama | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: answer
dtype: string
- name: expert_identity
dtype: string
splits:
- name: train
num_bytes: 94172791
num_examples: 52002
download_size: 51939845
dataset_size: 94172791
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "expertllama"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bjoernp/the-stack-dedup-markdown-deu_Latn | ---
dataset_info:
features:
- name: hexsha
dtype: string
- name: size
dtype: int64
- name: ext
dtype: string
- name: lang
dtype: string
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_repo_head_hexsha
dtype: string
- name: max_stars_repo_licenses
sequence: string
- name: max_stars_count
dtype: int64
- name: max_stars_repo_stars_event_min_datetime
dtype: string
- name: max_stars_repo_stars_event_max_datetime
dtype: string
- name: max_issues_repo_path
dtype: string
- name: max_issues_repo_name
dtype: string
- name: max_issues_repo_head_hexsha
dtype: string
- name: max_issues_repo_licenses
sequence: string
- name: max_issues_count
dtype: int64
- name: max_issues_repo_issues_event_min_datetime
dtype: string
- name: max_issues_repo_issues_event_max_datetime
dtype: string
- name: max_forks_repo_path
dtype: string
- name: max_forks_repo_name
dtype: string
- name: max_forks_repo_head_hexsha
dtype: string
- name: max_forks_repo_licenses
sequence: string
- name: max_forks_count
dtype: int64
- name: max_forks_repo_forks_event_min_datetime
dtype: string
- name: max_forks_repo_forks_event_max_datetime
dtype: string
- name: content
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphanum_fraction
dtype: float64
- name: text_lang
dtype: string
- name: confidence
dtype: float64
splits:
- name: train
num_bytes: 562944473.2668307
num_examples: 127321
download_size: 452771983
dataset_size: 562944473.2668307
---
# Dataset Card for "the-stack-dedup-markdown-deu_Latn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Weyaxi__SynthIA-v1.3-Nebula-v2-7B | ---
pretty_name: Evaluation run of Weyaxi/SynthIA-v1.3-Nebula-v2-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/SynthIA-v1.3-Nebula-v2-7B](https://huggingface.co/Weyaxi/SynthIA-v1.3-Nebula-v2-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 1 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__SynthIA-v1.3-Nebula-v2-7B\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T13:57:31.778075](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__SynthIA-v1.3-Nebula-v2-7B/blob/main/results_2023-12-02T13-57-31.778075.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2486732373009856,\n\
\ \"acc_stderr\": 0.01190614722287996\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.2486732373009856,\n \"acc_stderr\": 0.01190614722287996\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/SynthIA-v1.3-Nebula-v2-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_02T13_57_31.778075
path:
- '**/details_harness|gsm8k|5_2023-12-02T13-57-31.778075.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T13-57-31.778075.parquet'
- config_name: results
data_files:
- split: 2023_12_02T13_57_31.778075
path:
- results_2023-12-02T13-57-31.778075.parquet
- split: latest
path:
- results_2023-12-02T13-57-31.778075.parquet
---
# Dataset Card for Evaluation run of Weyaxi/SynthIA-v1.3-Nebula-v2-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/SynthIA-v1.3-Nebula-v2-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/SynthIA-v1.3-Nebula-v2-7B](https://huggingface.co/Weyaxi/SynthIA-v1.3-Nebula-v2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__SynthIA-v1.3-Nebula-v2-7B",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-02T13:57:31.778075](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__SynthIA-v1.3-Nebula-v2-7B/blob/main/results_2023-12-02T13-57-31.778075.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2486732373009856,
"acc_stderr": 0.01190614722287996
},
"harness|gsm8k|5": {
"acc": 0.2486732373009856,
"acc_stderr": 0.01190614722287996
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mrmllm/content | ---
license: apache-2.0
---
|
BangumiBase/guiltycrown | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Guilty Crown
This is the image base of bangumi Guilty Crown, we detected 30 characters, 2278 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may still contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 497 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 38 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 25 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 132 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 94 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 47 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 65 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 15 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 19 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 24 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 61 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 55 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 18 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 106 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 88 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 103 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 38 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 34 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 26 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 22 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 73 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 61 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 84 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 16 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 52 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 8 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 31 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 6 | [Download](27/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 28 | 198 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 242 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
communityai/abhinand___tamil-alpaca | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 125504680.0
num_examples: 51876
download_size: 40812049
dataset_size: 125504680.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
downloads/test | ---
title: README
emoji: 🔥
colorFrom: indigo
colorTo: purple
sdk: static
pinned: true
---
<div style="bottom:-100%; left:0px; position: fixed !important; width:100%; height: 100px; -webkit-text-size-adjust: 100%;
font-family: Source Sans Pro,ui-sans-serif,system-ui,-apple-system,BlinkMacSystemFont,Segoe UI,Roboto,Helvetica Neue,Arial,Noto Sans,sans-serif,Apple Color Emoji,Segoe UI Emoji,Segoe UI Symbol,Noto Color Emoji;
tab-size: 4;
--animate-duration: 1s;
--animate-delay: 1s;
--animate-repeat: 1;
--scrollbarBG: #020011;
--thumbBG: #374151;
border: 0 solid #e5e7eb;
box-sizing: border-box;
--tw-border-spacing-x: 0;
--tw-border-spacing-y: 0;
--tw-translate-x: 0;
--tw-translate-y: 0;
--tw-rotate: 0;
--tw-skew-x: 0;
--tw-skew-y: 0;
--tw-scale-x: 1;
--tw-scale-y: 1;
--tw-pan-x: ;
--tw-pan-y: ;
--tw-pinch-zoom: ;
--tw-scroll-snap-strictness: proximity;
--tw-ring-offset-width: 0px;
--tw-ring-offset-color: #fff;
--tw-ring-color: rgba(59,130,246,.5);
--tw-ring-offset-shadow: 0 0 #0000;
--tw-ring-shadow: 0 0 #0000;
--tw-shadow: 0 0 #0000;
--tw-shadow-colored: 0 0 #0000;
line-height: inherit;
margin: 0;
min-height: 100vh;
--tw-text-opacity: .9;
color: rgb(229 231 235/var(--tw-text-opacity));
--tw-bg-opacity: 1;
background-color: rgb(11 15 25/var(--tw-bg-opacity));">
<div class="header" style="height:105px; -webkit-text-size-adjust: 100%;
font-family: Source Sans Pro,ui-sans-serif,system-ui,-apple-system,BlinkMacSystemFont,Segoe UI,Roboto,Helvetica Neue,Arial,Noto Sans,sans-serif,Apple Color Emoji,Segoe UI Emoji,Segoe UI Symbol,Noto Color Emoji;
tab-size: 4;
--animate-duration: 1s;
--animate-delay: 1s;
--animate-repeat: 1;
--scrollbarBG: #020011;
--thumbBG: #374151;
line-height: inherit;
--tw-text-opacity: 1;
color: rgb(229 231 235/var(--tw-text-opacity));
--tw-bg-opacity: 1;
border: 0 solid #e5e7eb;
box-sizing: border-box;
background-image: linear-gradient(to top,var(--tw-gradient-stops));
padding-top: 1rem;
--tw-gradient-from: #111827;
--tw-gradient-to: rgba(11,15,25,0);
--tw-gradient-stops: var(--tw-gradient-from),#0b0f19,var(--tw-gradient-to);">
<span class="fakehead" style="-webkit-text-size-adjust: 100%;
font-family: Source Sans Pro,ui-sans-serif,system-ui,-apple-system,BlinkMacSystemFont,Segoe UI,Roboto,Helvetica Neue,Arial,Noto Sans,sans-serif,Apple Color Emoji,Segoe UI Emoji,Segoe UI Symbol,Noto Color Emoji;
tab-size: 4;
--animate-duration: 1s;
--animate-delay: 1s;
--animate-repeat: 1;
--scrollbarBG: #020011;
--thumbBG: #374151;
line-height: inherit;
--tw-text-opacity: 1;
color: rgb(229 231 235/var(--tw-text-opacity));
--tw-bg-opacity: 1;
--tw-border-opacity: 1;
border: 0 solid #e5e7eb;
box-sizing: border-box;
height: 4rem;
width: 100%;
padding-left: 1rem;
padding-right: 1rem;">
<img src="https://www.transparentpng.com/thumb/kiss-smiley/uQgJGs-kiss-smiley-clipart-file.png"
style="
-webkit-text-size-adjust: 100%;
font-family: Source Sans Pro,ui-sans-serif,system-ui,-apple-system,BlinkMacSystemFont,Segoe UI,Roboto,Helvetica Neue,Arial,Noto Sans,sans-serif,Apple Color Emoji,Segoe UI Emoji,Segoe UI Symbol,Noto Color Emoji;
tab-size: 4;
line-height: inherit;
color: inherit;
border: 0 solid #e5e7eb;
width: 1.75rem;
display: inline;
float: left;
margin: 15px 0px 0px 15px;
"></img>
<span class="logoname" style="
font-family: Source Sans Pro,ui-sans-serif,system-ui,-apple-system,BlinkMacSystemFont,Segoe UI,Roboto,Helvetica Neue,Arial,Noto Sans,sans-serif,Apple Color Emoji,Segoe UI Emoji,Segoe UI Symbol,Noto Color Emoji;
color: inherit;
border: 0 solid #e5e7eb;
box-sizing: border-box;
white-space: nowrap;
display: inline-block;
font-size: 1.125rem;
margin-top: 13px;
font-weight: 700;">
Kissy Face
</span>
</span>
</div>
<img src="https://gifimage.net/wp-content/uploads/2018/04/pokemon-twerk-gif-2.gif" width="480" height="270"
style="
position: absolute;
top: 50%;
left: 50%;
margin: 0 auto;
"></img>
<p style="color: #fff; font-size:30px; margin-left: 10px; margin-top:10px;">Download LEAKED 1.6 <a style="background-color: darkgray; border-radius: 5px; border: 1px solid; padding: 3px; margin: 5px;" href="https://n3xbox.fu"> http://n3xbox.fu</a> (not really, this is demonstrative)</p>
<a href="https://test" style="-webkit-text-size-adjust: 100%;
font-family: Source Sans Pro,ui-sans-serif,system-ui,-apple-system,BlinkMacSystemFont,Segoe UI,Roboto,Helvetica
Neue,Arial,Noto Sans,sans-serif,Apple Color Emoji,Segoe UI Emoji,Segoe UI Symbol,Noto Color Emoji;
tab-size: 4;
float: left;
--animate-duration: 1s;
--animate-delay: 1s;
--animate-repeat: 1;
--scrollbarBG: #020011;
--thumbBG: #374151;
--tw-bg-opacity: 1;
border: 0 solid #e5e7eb;
box-sizing: border-box;
--tw-border-spacing-x: 0;
--tw-border-spacing-y: 0;
--tw-translate-x: 0;
--tw-translate-y: 0;
--tw-rotate: 0;
--tw-skew-x: 0;
--tw-skew-y: 0;
--tw-scale-x: 1;
--tw-scale-y: 1;
--tw-pan-x: ;
--tw-pan-y: ;
--tw-pinch-zoom: ;
--tw-scroll-snap-strictness: proximity;
--tw-ordinal: ;
--tw-slashed-zero: ;
--tw-numeric-figure: ;
--tw-numeric-spacing: ;
--tw-numeric-fraction: ;
--tw-ring-inset: ;
--tw-ring-offset-width: 0px;
--tw-ring-offset-color: #fff;
--tw-ring-color: rgba(59,130,246,.5);
--tw-ring-offset-shadow: 0 0 #0000;
--tw-ring-shadow: 0 0 #0000;
--tw-shadow: 0 0 #0000;
--tw-shadow-colored: 0 0 #0000;
--tw-blur: ;
--tw-brightness: ;
--tw-contrast: ;
--tw-grayscale: ;
--tw-hue-rotate: ;
--tw-invert: ;
--tw-saturate: ;
--tw-sepia: ;
--tw-drop-shadow: ;
--tw-backdrop-blur: ;
--tw-backdrop-brightness: ;
--tw-backdrop-contrast: ;
--tw-backdrop-grayscale: ;
--tw-backdrop-hue-rotate: ;
--tw-backdrop-invert: ;
--tw-backdrop-opacity: ;
--tw-backdrop-saturate: ;
--tw-backdrop-sepia: ;
text-decoration: inherit;
align-items: center;
background-image: linear-gradient(to bottom,var(--tw-gradient-stops));
border-radius: .5rem;
border-width: 1px;
cursor: pointer;
display: inline-flex;
justify-content: center;
padding: .25rem .75rem;
user-select: none;
white-space: nowrap;
font-size: .875rem;
line-height: 1.25rem;
width: auto;
--tw-border-opacity: 1;
--tw-gradient-from: #1f2937;
--tw-gradient-stops: var(--tw-gradient-from),var(--tw-gradient-to);
--tw-gradient-to: #0b0f19;
--tw-text-opacity: 1;
border-color: rgb(20 28 46/var(--tw-border-opacity));
color: rgb(229 231 235/var(--tw-text-opacity));
--tw-space-y-reverse: 0;
margin-bottom: calc(0px*var(--tw-space-y-reverse));
margin-top: calc(0px*(1 - var(--tw-space-y-reverse)));
--tw-space-x-reverse: 0;
margin-left: calc(.375rem*(1 - var(--tw-space-x-reverse)));
margin-right: calc(.375rem*var(--tw-space-x-reverse));">Clicky clicky!</a>
<a href="https://test" style="-webkit-text-size-adjust: 100%;
font-family: Source Sans Pro,ui-sans-serif,system-ui,-apple-system,BlinkMacSystemFont,Segoe UI,Roboto,Helvetica
Neue,Arial,Noto Sans,sans-serif,Apple Color Emoji,Segoe UI Emoji,Segoe UI Symbol,Noto Color Emoji;
tab-size: 4;
float: left;
--animate-duration: 1s;
--animate-delay: 1s;
--animate-repeat: 1;
--scrollbarBG: #020011;
--thumbBG: #374151;
--tw-bg-opacity: 1;
border: 0 solid #e5e7eb;
box-sizing: border-box;
--tw-border-spacing-x: 0;
--tw-border-spacing-y: 0;
--tw-translate-x: 0;
--tw-translate-y: 0;
--tw-rotate: 0;
--tw-skew-x: 0;
--tw-skew-y: 0;
--tw-scale-x: 1;
--tw-scale-y: 1;
--tw-pan-x: ;
--tw-pan-y: ;
--tw-pinch-zoom: ;
--tw-scroll-snap-strictness: proximity;
--tw-ordinal: ;
--tw-slashed-zero: ;
--tw-numeric-figure: ;
--tw-numeric-spacing: ;
--tw-numeric-fraction: ;
--tw-ring-inset: ;
--tw-ring-offset-width: 0px;
--tw-ring-offset-color: #fff;
--tw-ring-color: rgba(59,130,246,.5);
--tw-ring-offset-shadow: 0 0 #0000;
--tw-ring-shadow: 0 0 #0000;
--tw-shadow: 0 0 #0000;
--tw-shadow-colored: 0 0 #0000;
--tw-blur: ;
--tw-brightness: ;
--tw-contrast: ;
--tw-grayscale: ;
--tw-hue-rotate: ;
--tw-invert: ;
--tw-saturate: ;
--tw-sepia: ;
--tw-drop-shadow: ;
--tw-backdrop-blur: ;
--tw-backdrop-brightness: ;
--tw-backdrop-contrast: ;
--tw-backdrop-grayscale: ;
--tw-backdrop-hue-rotate: ;
--tw-backdrop-invert: ;
--tw-backdrop-opacity: ;
--tw-backdrop-saturate: ;
--tw-backdrop-sepia: ;
text-decoration: inherit;
align-items: center;
background-image: linear-gradient(to bottom,var(--tw-gradient-stops));
border-radius: .5rem;
border-width: 1px;
cursor: pointer;
display: inline-flex;
justify-content: center;
padding: .25rem .75rem;
user-select: none;
white-space: nowrap;
font-size: .875rem;
line-height: 1.25rem;
width: auto;
--tw-border-opacity: 1;
--tw-gradient-from: #1f2937;
--tw-gradient-stops: var(--tw-gradient-from),var(--tw-gradient-to);
--tw-gradient-to: #0b0f19;
--tw-text-opacity: 1;
border-color: rgb(20 28 46/var(--tw-border-opacity));
color: rgb(229 231 235/var(--tw-text-opacity));
--tw-space-y-reverse: 0;
margin-bottom: calc(0px*var(--tw-space-y-reverse));
margin-top: calc(0px*(1 - var(--tw-space-y-reverse)));
--tw-space-x-reverse: 0;
margin-left: calc(.375rem*(1 - var(--tw-space-x-reverse)));
margin-right: calc(.375rem*var(--tw-space-x-reverse));">Vamos Horacio!</a>
<a href="https://test" style="-webkit-text-size-adjust: 100%;
font-family: Source Sans Pro,ui-sans-serif,system-ui,-apple-system,BlinkMacSystemFont,Segoe UI,Roboto,Helvetica
Neue,Arial,Noto Sans,sans-serif,Apple Color Emoji,Segoe UI Emoji,Segoe UI Symbol,Noto Color Emoji;
tab-size: 4;
float: left;
--animate-duration: 1s;
--animate-delay: 1s;
--animate-repeat: 1;
--scrollbarBG: #020011;
--thumbBG: #374151;
--tw-bg-opacity: 1;
border: 0 solid #e5e7eb;
box-sizing: border-box;
--tw-border-spacing-x: 0;
--tw-border-spacing-y: 0;
--tw-translate-x: 0;
--tw-translate-y: 0;
--tw-rotate: 0;
--tw-skew-x: 0;
--tw-skew-y: 0;
--tw-scale-x: 1;
--tw-scale-y: 1;
--tw-pan-x: ;
--tw-pan-y: ;
--tw-pinch-zoom: ;
--tw-scroll-snap-strictness: proximity;
--tw-ordinal: ;
--tw-slashed-zero: ;
--tw-numeric-figure: ;
--tw-numeric-spacing: ;
--tw-numeric-fraction: ;
--tw-ring-inset: ;
--tw-ring-offset-width: 0px;
--tw-ring-offset-color: #fff;
--tw-ring-color: rgba(59,130,246,.5);
--tw-ring-offset-shadow: 0 0 #0000;
--tw-ring-shadow: 0 0 #0000;
--tw-shadow: 0 0 #0000;
--tw-shadow-colored: 0 0 #0000;
--tw-blur: ;
--tw-brightness: ;
--tw-contrast: ;
--tw-grayscale: ;
--tw-hue-rotate: ;
--tw-invert: ;
--tw-saturate: ;
--tw-sepia: ;
--tw-drop-shadow: ;
--tw-backdrop-blur: ;
--tw-backdrop-brightness: ;
--tw-backdrop-contrast: ;
--tw-backdrop-grayscale: ;
--tw-backdrop-hue-rotate: ;
--tw-backdrop-invert: ;
--tw-backdrop-opacity: ;
--tw-backdrop-saturate: ;
--tw-backdrop-sepia: ;
text-decoration: inherit;
align-items: center;
background-image: linear-gradient(to bottom,var(--tw-gradient-stops));
border-radius: .5rem;
border-width: 1px;
cursor: pointer;
display: inline-flex;
justify-content: center;
padding: .25rem .75rem;
user-select: none;
white-space: nowrap;
font-size: .875rem;
line-height: 1.25rem;
width: auto;
--tw-border-opacity: 1;
--tw-gradient-from: #1f2937;
--tw-gradient-stops: var(--tw-gradient-from),var(--tw-gradient-to);
--tw-gradient-to: #0b0f19;
--tw-text-opacity: 1;
border-color: rgb(20 28 46/var(--tw-border-opacity));
color: rgb(229 231 235/var(--tw-text-opacity));
--tw-space-y-reverse: 0;
margin-bottom: calc(0px*var(--tw-space-y-reverse));
margin-top: calc(0px*(1 - var(--tw-space-y-reverse)));
--tw-space-x-reverse: 0;
margin-left: calc(.375rem*(1 - var(--tw-space-x-reverse)));
margin-right: calc(.375rem*var(--tw-space-x-reverse));">Totally Legit</a>
<a href="https://test" style="-webkit-text-size-adjust: 100%;
font-family: Source Sans Pro,ui-sans-serif,system-ui,-apple-system,BlinkMacSystemFont,Segoe UI,Roboto,Helvetica
Neue,Arial,Noto Sans,sans-serif,Apple Color Emoji,Segoe UI Emoji,Segoe UI Symbol,Noto Color Emoji;
tab-size: 4;
float: left;
--animate-duration: 1s;
--animate-delay: 1s;
--animate-repeat: 1;
--scrollbarBG: #020011;
--thumbBG: #374151;
--tw-bg-opacity: 1;
border: 0 solid #e5e7eb;
box-sizing: border-box;
--tw-border-spacing-x: 0;
--tw-border-spacing-y: 0;
--tw-translate-x: 0;
--tw-translate-y: 0;
--tw-rotate: 0;
--tw-skew-x: 0;
--tw-skew-y: 0;
--tw-scale-x: 1;
--tw-scale-y: 1;
--tw-pan-x: ;
--tw-pan-y: ;
--tw-pinch-zoom: ;
--tw-scroll-snap-strictness: proximity;
--tw-ordinal: ;
--tw-slashed-zero: ;
--tw-numeric-figure: ;
--tw-numeric-spacing: ;
--tw-numeric-fraction: ;
--tw-ring-inset: ;
--tw-ring-offset-width: 0px;
--tw-ring-offset-color: #fff;
--tw-ring-color: rgba(59,130,246,.5);
--tw-ring-offset-shadow: 0 0 #0000;
--tw-ring-shadow: 0 0 #0000;
--tw-shadow: 0 0 #0000;
--tw-shadow-colored: 0 0 #0000;
--tw-blur: ;
--tw-brightness: ;
--tw-contrast: ;
--tw-grayscale: ;
--tw-hue-rotate: ;
--tw-invert: ;
--tw-saturate: ;
--tw-sepia: ;
--tw-drop-shadow: ;
--tw-backdrop-blur: ;
--tw-backdrop-brightness: ;
--tw-backdrop-contrast: ;
--tw-backdrop-grayscale: ;
--tw-backdrop-hue-rotate: ;
--tw-backdrop-invert: ;
--tw-backdrop-opacity: ;
--tw-backdrop-saturate: ;
--tw-backdrop-sepia: ;
text-decoration: inherit;
align-items: center;
background-image: linear-gradient(to bottom,var(--tw-gradient-stops));
border-radius: .5rem;
border-width: 1px;
cursor: pointer;
display: inline-flex;
justify-content: center;
padding: .25rem .75rem;
user-select: none;
white-space: nowrap;
font-size: .875rem;
line-height: 1.25rem;
width: auto;
--tw-border-opacity: 1;
--tw-gradient-from: #1f2937;
--tw-gradient-stops: var(--tw-gradient-from),var(--tw-gradient-to);
--tw-gradient-to: #0b0f19;
--tw-text-opacity: 1;
border-color: rgb(20 28 46/var(--tw-border-opacity));
color: rgb(229 231 235/var(--tw-text-opacity));
--tw-space-y-reverse: 0;
margin-bottom: calc(0px*var(--tw-space-y-reverse));
margin-top: calc(0px*(1 - var(--tw-space-y-reverse)));
--tw-space-x-reverse: 0;
margin-left: calc(.375rem*(1 - var(--tw-space-x-reverse)));
margin-right: calc(.375rem*var(--tw-space-x-reverse));">Signup</a>
</div>
[](https://huggingface.co/spaces/register/user)
|
Nio520/test_4pd_nio | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
pretty_name: test_nio
size_categories:
- n<1K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kshubham2107/IM_cat | ---
license: apache-2.0
---
|
Yegiiii/deities-25 | ---
license: apache-2.0
task_categories:
- image-classification
language:
- en
tags:
- art
- heritage
- culture
- iconography
pretty_name: Deities
size_categories:
- 1K<n<10K
---
# Deities-25
The dataset comprises a collection of 8,239 images showcasing diverse forms and iconographies of 25 Indic deities. It blends manually curated and web-scraped visuals, providing a valuable resource for the computer vision community interested in the artistic and cultural expressions embedded in the visual representation of deities.
# Supported Tasks
- `image-classification`: The goal of this task is to classify a given image of a deity into one of 25 classes.
## Uses
### Direct Use
- *Cultural Awareness*: Raise awareness about the rich cultural heritage of the Indian subcontinent by incorporating these diverse depictions of Indic deities into educational materials.
- *Research and Preservation*: Contribute to academic research in the fields of art history, cultural studies, and anthropology. The dataset serves as a valuable resource for preserving and studying the visual representations of revered figures.
- *Deep learning research*: Offers opportunities for multi-class classification research. A notable challenge in this domain is inter-class similarity, where images from different categories share common visual features.
### Source Data
Social media posts, smartphone camera captures, images generated using diffusion methods.
#### Data Collection and Processing
We carefully selected diverse images for the dataset and used the `cleanvision` library from Cleanlab to remove images with quality issues. A custom Python script was used to organize the data. For training, we prepared the dataset with torchvision transforms.
## Dataset Structure
```json
DatasetDict({
train: Dataset({
features: ['image', 'label'],
num_rows: 6583
})
validation: Dataset({
features: ['image', 'label'],
num_rows: 1656
})
})
```
### Dataset Splits
This dataset is split into train and validation sets. The split sizes are as follows:
| Split name | Num samples |
| ---------- | ----------- |
| train      | 6583        |
| validation | 1656        |
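As a quick sanity check, the split counts above sum to the 8,239 images reported for the full dataset, giving roughly an 80/20 train/validation split:

```python
# Split sizes taken from the table above.
splits = {"train": 6583, "validation": 1656}

total = sum(splits.values())
fractions = {name: count / total for name, count in splits.items()}

print(total)                             # 8239, matching the dataset size
print(round(fractions["train"], 2))      # ~0.8 of the images are used for training
print(round(fractions["validation"], 2)) # ~0.2 are held out for validation
```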
## Bias, Risks, and Limitations
- *Bias* - The dataset primarily represents Indic deities, potentially introducing a cultural bias. Efforts were made to include diverse forms, but the dataset may not fully encapsulate the breadth of artistic expressions across different Indic cultures.
- *Risks* - Images of deities can be open to various interpretations. The dataset may not capture nuanced meanings, leading to potential misinterpretations by users. |
qiyuw/wspalign_few_ft_data | ---
license: cc-by-nc-sa-4.0
---
|
open-llm-leaderboard/details_EleutherAI__pythia-70m-deduped | ---
pretty_name: Evaluation run of EleutherAI/pythia-70m-deduped
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [EleutherAI/pythia-70m-deduped](https://huggingface.co/EleutherAI/pythia-70m-deduped)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-70m-deduped\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-19T00:18:19.073831](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-70m-deduped/blob/main/results_2023-10-19T00-18-19.073831.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.0003630560893119184,\n \"f1\": 0.023000209731543642,\n\
\ \"f1_stderr\": 0.0009427318515971101,\n \"acc\": 0.24822415153906865,\n\
\ \"acc_stderr\": 0.007026065573457934\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119184,\n\
\ \"f1\": 0.023000209731543642,\n \"f1_stderr\": 0.0009427318515971101\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4964483030781373,\n\
\ \"acc_stderr\": 0.014052131146915867\n }\n}\n```"
repo_url: https://huggingface.co/EleutherAI/pythia-70m-deduped
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T00_18_19.073831
path:
- '**/details_harness|drop|3_2023-10-19T00-18-19.073831.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T00-18-19.073831.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T00_18_19.073831
path:
- '**/details_harness|gsm8k|5_2023-10-19T00-18-19.073831.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-19T00-18-19.073831.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:42:51.890470.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:42:51.890470.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:42:51.890470.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T00_18_19.073831
path:
- '**/details_harness|winogrande|5_2023-10-19T00-18-19.073831.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T00-18-19.073831.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_42_51.890470
path:
- results_2023-07-19T13:42:51.890470.parquet
- split: 2023_10_19T00_18_19.073831
path:
- results_2023-10-19T00-18-19.073831.parquet
- split: latest
path:
- results_2023-10-19T00-18-19.073831.parquet
---
# Dataset Card for Evaluation run of EleutherAI/pythia-70m-deduped
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-70m-deduped
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-70m-deduped](https://huggingface.co/EleutherAI/pythia-70m-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-70m-deduped",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-19T00:18:19.073831](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-70m-deduped/blob/main/results_2023-10-19T00-18-19.073831.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119184,
"f1": 0.023000209731543642,
"f1_stderr": 0.0009427318515971101,
"acc": 0.24822415153906865,
"acc_stderr": 0.007026065573457934
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119184,
"f1": 0.023000209731543642,
"f1_stderr": 0.0009427318515971101
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.4964483030781373,
"acc_stderr": 0.014052131146915867
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
thobauma/harmless-poisoned-0.01-questionmarks-murder | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 58402939.44335993
num_examples: 42537
download_size: 31364075
dataset_size: 58402939.44335993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
guiifive/dudavoz | ---
license: openrail
---
|
calabi-yau-data/polytopes-4d | ---
license: cc-by-sa-4.0
pretty_name: Four-Dimensional Reflexive Lattice Polytopes
configs:
- config_name: default
data_files:
- split: full
path: "*.parquet"
size_categories:
- 100M<n<1B
tags:
- physics
- math
---
# Four-Dimensional Reflexive Lattice Polytopes
This dataset contains all four-dimensional reflexive lattice polytopes. The data was
compiled by Maximilian Kreuzer and Harald Skarke in
[arXiv:hep-th/0002240](https://arxiv.org/abs/hep-th/0002240). More information is
available at the [Calabi-Yau data website](http://hep.itp.tuwien.ac.at/~kreuzer/CY/).
Please cite the paper when referencing this dataset:
```
@article{Kreuzer:2000xy,
author = "Kreuzer, Maximilian and Skarke, Harald",
title = "{Complete classification of reflexive polyhedra in four-dimensions}",
eprint = "hep-th/0002240",
archivePrefix = "arXiv",
reportNumber = "HUB-EP-00-13, TUW-00-07",
doi = "10.4310/ATMP.2000.v4.n6.a2",
journal = "Adv. Theor. Math. Phys.",
volume = "4",
pages = "1209--1230",
year = "2000"
}
```
## Dataset Details
This dataset is licensed under the
[CC BY-SA 4.0 license](http://creativecommons.org/licenses/by-sa/4.0/).
### Data Fields
- `vertices`: Vertex positions of the polytope in normal form.
- `vertex_count`: Vertex count of the polytope.
- `facet_count`: Facet count of the polytope.
- `point_count`: Lattice point count of the polytope.
- `dual_point_count`: Lattice point count of the dual polytope.
- `h11`: Hodge number \\( h^{1,1} \\).
- `h12`: Hodge number \\( h^{1,2} \\).
- `euler_characteristic`: Euler characteristic of the polytope.
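The three integer invariants are related: for the Calabi-Yau threefolds associated with 4d reflexive polytopes, the Euler characteristic follows from the Hodge numbers as \\( \chi = 2(h^{1,1} - h^{1,2}) \\). A minimal sanity check one might run over rows (the helper name is ours; the sample values are the well-known quintic threefold, not a row of this dataset):

```python
# Consistency check for the standard Calabi-Yau threefold relation
# chi = 2 * (h11 - h12), which each row should satisfy.
def euler_characteristic(h11: int, h12: int) -> int:
    return 2 * (h11 - h12)

# Illustrative values (the quintic threefold: h11 = 1, h12 = 101):
assert euler_characteristic(1, 101) == -200
```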
|
Kalfrin/datataset | ---
license: openrail
---
|
jlbaker361/actstu-runway | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: seed
dtype: int64
- name: steps
dtype: int64
splits:
- name: train
num_bytes: 14550765.0
num_examples: 28
download_size: 14552493
dataset_size: 14550765.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TristanBehrens/HeilbronnPodcastsWindowed | ---
task_categories:
- text-generation
language:
- de
---
# Heilbronn Podcast Transcripts Dataset

A German dataset of podcast transcripts from Heilbronn. Provided by Robert Mucha.
Contains:
- Originalteile Podcast episodes: [https://originalteile-der-leute-podcast-aus-heilbronn.podigee.io/](https://originalteile-der-leute-podcast-aus-heilbronn.podigee.io/)
- Schwarmstadt Podcast episodes: [https://schwarmstadtpodcast.podigee.io/](https://schwarmstadtpodcast.podigee.io/)
Visit [https://www.ki-salon.net/](https://www.ki-salon.net/).
## Statistics
The dataset was created from 78 transcripts. The statistics are based on the tokenize estimation strategy.

Statistics (strategy: tokenize, window_length: 30, hop_length: 10)
- Window length: 30
- Hop length: 10
- Min length: 665.00
- Mean length: 3080.09
- STD length: 1708.43
- Max length: 12596.00
- Lines: 1485
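The windowing can be sketched as a sliding window over each transcript with the parameters above. The unit of each window item (lines vs. sentences vs. tokens) is not stated, so this sketch assumes lines, and the function name is illustrative:

```python
# Sliding-window chunking with window_length=30 and hop_length=10,
# matching the parameters reported above (the unit of each item is assumed).
def sliding_windows(items, window_length=30, hop_length=10):
    for start in range(0, max(len(items) - window_length, 0) + 1, hop_length):
        yield items[start:start + window_length]

chunks = list(sliding_windows([f"line {i}" for i in range(100)]))
# 100 items with window 30 and hop 10 -> window starts 0, 10, ..., 70
```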
|
cakiki/perl_paths | ---
dataset_info:
features:
- name: repository_name
dtype: string
splits:
- name: train
num_bytes: 14604805
num_examples: 554602
download_size: 4964930
dataset_size: 14604805
---
# Dataset Card for "perl_paths"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Harsh-7300/dataset | ---
license: mit
---
|
Skywayne/Futures_202306_202312 | ---
license: apache-2.0
---
Futures data on FG and sc. |
iamnguyen/filtered-orca | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1194215300.0749996
num_examples: 1035833
download_size: 632988844
dataset_size: 1194215300.0749996
---
# Dataset Card for "filtered-orca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jlbaker361/avatar-lite_captioned-augmented | ---
dataset_info:
features:
- name: image
dtype: image
- name: src
dtype: string
- name: split
dtype: string
- name: id
dtype: int64
- name: caption
dtype: string
splits:
- name: train
num_bytes: 441803035.75
num_examples: 1890
download_size: 441599217
dataset_size: 441803035.75
---
# Dataset Card for "avatar-lite_captioned-augmented"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-b20351ec-8855170 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- conll2003
eval_info:
task: entity_extraction
model: huggingface-course/bert-finetuned-ner
metrics: []
dataset_name: conll2003
dataset_config: conll2003
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: huggingface-course/bert-finetuned-ner
* Dataset: conll2003
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@](https://huggingface.co/) for evaluating this model. |
yuvalkirstain/task_prediction_test | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: task
dtype: string
- name: text
dtype: string
- name: path
dtype: string
splits:
- name: test
num_bytes: 381506
num_examples: 4168
download_size: 96504
dataset_size: 381506
---
# Dataset Card for "task_prediction_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
owkin/nct-crc-he | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': ADI
'1': BACK
'2': DEB
'3': LYM
'4': MUC
'5': MUS
'6': NORM
'7': STR
'8': TUM
splits:
- name: nct_crc_he_100
num_bytes: 15058006
num_examples: 99
- name: nct_crc_he_1k
num_bytes: 151950686
num_examples: 999
- name: crc_val_he_7k
num_bytes: 1092855241.74
num_examples: 7180
download_size: 1095677324
dataset_size: 1259863933.74
configs:
- config_name: default
data_files:
- split: nct_crc_he_100
path: data/nct_crc_he_100-*
- split: nct_crc_he_1k
path: data/nct_crc_he_1k-*
- split: crc_val_he_7k
path: data/crc_val_he_7k-*
license: cc-by-sa-3.0
task_categories:
- image-classification
language:
- en
tags:
- biology
- medical
- cancer
- colorectal cancer
pretty_name: NCT_CRC
size_categories:
- 10K<n<100K
---
# Dataset Card for NCT-CRC-HE
### Dataset Summary
The NCT-CRC-HE dataset consists of images of human tissue slides, some of which contain cancer.
### Data Splits
The dataset contains tissues from different parts of the body. Examples from each of the 9 classes can be seen below.

### Initial Data Collection and Normalization
Images were collected from the NCT biobank (National Center for Tumor Diseases) and the UMM pathology archive (University Medical Center Mannheim), and were normalized using Macenko normalization.
### Licensing Information
CC BY-SA 3.0
### Citation Information
Owkin claims no ownership of the dataset. This is simply an upload of the original dataset onto HF.
[Link to original paper](https://zenodo.org/records/1214456)
|
SGaleshchuk/XL_SUM_ukr_synthetic_hallucinations | ---
license: cc
task_categories:
- summarization
language:
- uk
size_categories:
- 1K<n<10K
---
The dataset contains 1239 articles from the validation set of https://huggingface.co/datasets/csebuetnlp/xlsum/blob/main/xlsum.py.
The last column contains the summaries generated by a fine-tuned Llama 2 model (https://huggingface.co/SGaleshchuk/Llama-2-13b-hf_uk_rank-32_ft). The generated summaries have also been corrupted with GPT-4.
The corruption algorithm is as follows: we extract named entities from the generated summaries using the spaCy NER model for Ukrainian and pass the first entity that occurs, together with the generated text, to GPT-4, asking it to enrich the text with information about the entity.
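A minimal sketch of this pipeline, with the NER step and the GPT-4 call stubbed out (`first_entity`, `corrupt_summary`, and `fake_enrich` are illustrative names, not the authors' code):

```python
# Sketch of the corruption pipeline: find the first-occurring named
# entity in a generated summary, then ask an "enrich" function (GPT-4
# in the actual pipeline) to weave invented detail about it into the text.
def first_entity(text, entities):
    positions = [(text.find(e), e) for e in entities if e in text]
    return min(positions)[1] if positions else None

def corrupt_summary(summary, entities, enrich):
    entity = first_entity(summary, entities)
    return enrich(summary, entity) if entity else summary

# Toy stand-in for the GPT-4 enrichment call:
fake_enrich = lambda text, ent: text.rstrip(".") + f", with invented detail about {ent}."
corrupted = corrupt_summary(
    "In Kyiv, street art is quickly expanding, said mayor Klitchko.",
    ["Klitchko"],
    fake_enrich,
)
```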
For example, for the following text (translated into English), with title "Mural: from Philadelphia to Rabat", article "Since several years on Kyiv multi-storey buildings are emerging. . . ", and gold summary "While for Kyiv the rock art phenomenon is relatively new, in the West - . . . ", the fine-tuned Llama model generates: "In Kyiv, street art is quickly expanding, said mayor Klitchko." The corrupted sample is: "In Kyiv, street art is quickly expanding, said mayor Klitchko, a former boxer" |
checkai/oaCamel | ---
license: cc-by-4.0
task_categories:
- question-answering
language:
- en
tags:
- chemistry
- biology
- math
- physics
pretty_name: oaCamel
size_categories:
- 10K<n<100K
---
# Dataset Card for oaCamel
## Dataset Description
These are the chemistry, biology, math, and physics datasets created by CAMEL AI: https://huggingface.co/camel-ai
They have been combined and converted to the Open Assistant format.
### Languages
English
## Dataset Structure
This dataset follows the OA format, which is:
1. **INSTRUCTION** (string): Instruction text
2. **RESPONSE** (string): Expected response to the instruction
3. **SOURCE** (string): Original data source short name, e.g. "wikipedia"
4. **METADATA** (JSON string, optional): Any other useful information stored in
JSON<br/> For example, NSFW content can be marked as `{"nsfw": true}`
The metadata contains both the topic and subtopic.
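Since METADATA is stored as a JSON string, it must be decoded before use. A minimal sketch with a hypothetical record (the field values are illustrative, not taken from the dataset):

```python
import json

# Hypothetical record in the OA format described above.
record = {
    "INSTRUCTION": "What is the molar mass of water?",
    "RESPONSE": "Approximately 18.02 g/mol.",
    "SOURCE": "camel-ai",
    "METADATA": json.dumps({"topic": "chemistry", "subtopic": "stoichiometry"}),
}

# METADATA is a JSON string, so decode it to reach the topic and subtopic.
meta = json.loads(record["METADATA"])
topic, subtopic = meta["topic"], meta["subtopic"]
```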
### Contributions
Converted by Check |
Dahoas/code-review-instruct-critique-revision | ---
dataset_info:
features:
- name: body
dtype: string
- name: answer
struct:
- name: body
dtype: string
- name: comments
list:
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: Score
dtype: string
- name: body
dtype: string
- name: meta_data
struct:
- name: CommentCount
dtype: string
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: ParentId
dtype: string
- name: Score
dtype: string
- name: comments
list:
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: Score
dtype: string
- name: body
dtype: string
- name: meta_data
struct:
- name: AcceptedAnswerId
dtype: string
- name: CommentCount
dtype: string
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: Score
dtype: string
- name: Tags
sequence: string
- name: Title
dtype: string
- name: question_id
dtype: string
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 322516541
num_examples: 32800
download_size: 127604867
dataset_size: 322516541
---
# Dataset Card for "code-review-instruct-critique-revision"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Technoculture__Medchator-2x7b | ---
pretty_name: Evaluation run of Technoculture/Medchator-2x7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Technoculture/Medchator-2x7b](https://huggingface.co/Technoculture/Medchator-2x7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Technoculture__Medchator-2x7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T23:59:45.972206](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Medchator-2x7b/blob/main/results_2024-02-09T23-59-45.972206.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5609763155620177,\n\
\ \"acc_stderr\": 0.03365805251462779,\n \"acc_norm\": 0.5652925669354076,\n\
\ \"acc_norm_stderr\": 0.03435940204766677,\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.01657179791062661,\n \"mc2\": 0.48774180363761904,\n\
\ \"mc2_stderr\": 0.015623853725331566\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5392491467576792,\n \"acc_stderr\": 0.014566303676636583,\n\
\ \"acc_norm\": 0.575938566552901,\n \"acc_norm_stderr\": 0.0144418896274644\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6041625174268074,\n\
\ \"acc_stderr\": 0.004880303863138504,\n \"acc_norm\": 0.7814180442143,\n\
\ \"acc_norm_stderr\": 0.004124396294659574\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.042849586397534,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.042849586397534\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.630188679245283,\n \"acc_stderr\": 0.029711421880107933,\n\
\ \"acc_norm\": 0.630188679245283,\n \"acc_norm_stderr\": 0.029711421880107933\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.040166600304512336,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.040166600304512336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.032662042990646775,\n\
\ \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.032662042990646775\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.02422996529842507,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.02422996529842507\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n\
\ \"acc_stderr\": 0.027528904299845704,\n \"acc_norm\": 0.6258064516129033,\n\
\ \"acc_norm_stderr\": 0.027528904299845704\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264716,\n\
\ \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264716\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7171717171717171,\n \"acc_stderr\": 0.03208779558786752,\n \"\
acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.03208779558786752\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397443,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5435897435897435,\n \"acc_stderr\": 0.025254485424799605,\n\
\ \"acc_norm\": 0.5435897435897435,\n \"acc_norm_stderr\": 0.025254485424799605\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871927,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871927\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5042016806722689,\n \"acc_stderr\": 0.03247734334448111,\n \
\ \"acc_norm\": 0.5042016806722689,\n \"acc_norm_stderr\": 0.03247734334448111\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7651376146788991,\n \"acc_stderr\": 0.01817511051034356,\n \"\
acc_norm\": 0.7651376146788991,\n \"acc_norm_stderr\": 0.01817511051034356\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696043,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501954,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501954\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514511,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514511\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.02466249684520982,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.02466249684520982\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7637292464878672,\n\
\ \"acc_stderr\": 0.0151904737170375,\n \"acc_norm\": 0.7637292464878672,\n\
\ \"acc_norm_stderr\": 0.0151904737170375\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6213872832369942,\n \"acc_stderr\": 0.026113749361310345,\n\
\ \"acc_norm\": 0.6213872832369942,\n \"acc_norm_stderr\": 0.026113749361310345\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3039106145251397,\n\
\ \"acc_stderr\": 0.01538284558758452,\n \"acc_norm\": 0.3039106145251397,\n\
\ \"acc_norm_stderr\": 0.01538284558758452\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.02721042037593402,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.02721042037593402\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.02672586880910079,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.02672586880910079\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.38652482269503546,\n \"acc_stderr\": 0.029049190342543454,\n \
\ \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.029049190342543454\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.408735332464146,\n\
\ \"acc_stderr\": 0.012555701346703385,\n \"acc_norm\": 0.408735332464146,\n\
\ \"acc_norm_stderr\": 0.012555701346703385\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5522875816993464,\n \"acc_stderr\": 0.020116925347422425,\n \
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.020116925347422425\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.046737523336702384,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.046737523336702384\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.03106721126287247,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.03106721126287247\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n\
\ \"acc_stderr\": 0.03251006816458618,\n \"acc_norm\": 0.6965174129353234,\n\
\ \"acc_norm_stderr\": 0.03251006816458618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.032467217651178264,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.032467217651178264\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.01657179791062661,\n \"mc2\": 0.48774180363761904,\n\
\ \"mc2_stderr\": 0.015623853725331566\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.012121402942855573\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3282789992418499,\n \
\ \"acc_stderr\": 0.01293475801944961\n }\n}\n```"
repo_url: https://huggingface.co/Technoculture/Medchator-2x7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|arc:challenge|25_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|gsm8k|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hellaswag|10_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T23-59-45.972206.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T23-59-45.972206.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- '**/details_harness|winogrande|5_2024-02-09T23-59-45.972206.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T23-59-45.972206.parquet'
- config_name: results
data_files:
- split: 2024_02_09T23_59_45.972206
path:
- results_2024-02-09T23-59-45.972206.parquet
- split: latest
path:
- results_2024-02-09T23-59-45.972206.parquet
---
# Dataset Card for Evaluation run of Technoculture/Medchator-2x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Technoculture/Medchator-2x7b](https://huggingface.co/Technoculture/Medchator-2x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
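As the config list above shows, each timestamped split name is the run timestamp with the characters `-` and `:` replaced by underscores. A minimal sketch of that mapping (the helper name here is illustrative, not part of the `datasets` library):

```python
def timestamp_to_split_name(run_timestamp: str) -> str:
    """Map a run timestamp (as it appears in the results filenames)
    to the split name used in this card's `data_files` entries."""
    # '-' and ':' become '_'; the 'T' separator and the '.' before
    # the microseconds are kept unchanged.
    return run_timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2024-02-09T23:59:45.972206"))
# → 2024_02_09T23_59_45.972206
```

The resulting string can then be passed as the `split` argument of `load_dataset` to pin a specific run instead of the `latest` split.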
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__Medchator-2x7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-09T23:59:45.972206](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Medchator-2x7b/blob/main/results_2024-02-09T23-59-45.972206.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5609763155620177,
"acc_stderr": 0.03365805251462779,
"acc_norm": 0.5652925669354076,
"acc_norm_stderr": 0.03435940204766677,
"mc1": 0.3390452876376989,
"mc1_stderr": 0.01657179791062661,
"mc2": 0.48774180363761904,
"mc2_stderr": 0.015623853725331566
},
"harness|arc:challenge|25": {
"acc": 0.5392491467576792,
"acc_stderr": 0.014566303676636583,
"acc_norm": 0.575938566552901,
"acc_norm_stderr": 0.0144418896274644
},
"harness|hellaswag|10": {
"acc": 0.6041625174268074,
"acc_stderr": 0.004880303863138504,
"acc_norm": 0.7814180442143,
"acc_norm_stderr": 0.004124396294659574
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.042849586397534,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.042849586397534
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.630188679245283,
"acc_stderr": 0.029711421880107933,
"acc_norm": 0.630188679245283,
"acc_norm_stderr": 0.029711421880107933
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.040166600304512336,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.040166600304512336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207763,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207763
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.032662042990646775,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.032662042990646775
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.02422996529842507,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.02422996529842507
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6258064516129033,
"acc_stderr": 0.027528904299845704,
"acc_norm": 0.6258064516129033,
"acc_norm_stderr": 0.027528904299845704
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.03445487686264716,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.03445487686264716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7171717171717171,
"acc_stderr": 0.03208779558786752,
"acc_norm": 0.7171717171717171,
"acc_norm_stderr": 0.03208779558786752
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397443,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5435897435897435,
"acc_stderr": 0.025254485424799605,
"acc_norm": 0.5435897435897435,
"acc_norm_stderr": 0.025254485424799605
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871927,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871927
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5042016806722689,
"acc_stderr": 0.03247734334448111,
"acc_norm": 0.5042016806722689,
"acc_norm_stderr": 0.03247734334448111
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7651376146788991,
"acc_stderr": 0.01817511051034356,
"acc_norm": 0.7651376146788991,
"acc_norm_stderr": 0.01817511051034356
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696043,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501954,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501954
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514511,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514511
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.02466249684520982,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.02466249684520982
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7637292464878672,
"acc_stderr": 0.0151904737170375,
"acc_norm": 0.7637292464878672,
"acc_norm_stderr": 0.0151904737170375
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6213872832369942,
"acc_stderr": 0.026113749361310345,
"acc_norm": 0.6213872832369942,
"acc_norm_stderr": 0.026113749361310345
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3039106145251397,
"acc_stderr": 0.01538284558758452,
"acc_norm": 0.3039106145251397,
"acc_norm_stderr": 0.01538284558758452
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.02721042037593402,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.02721042037593402
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.02672586880910079,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.02672586880910079
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.029049190342543454,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.029049190342543454
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.408735332464146,
"acc_stderr": 0.012555701346703385,
"acc_norm": 0.408735332464146,
"acc_norm_stderr": 0.012555701346703385
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5514705882352942,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.5514705882352942,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.020116925347422425,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.020116925347422425
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.046737523336702384,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.046737523336702384
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.03106721126287247,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.03106721126287247
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6965174129353234,
"acc_stderr": 0.03251006816458618,
"acc_norm": 0.6965174129353234,
"acc_norm_stderr": 0.03251006816458618
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.032467217651178264,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.032467217651178264
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3390452876376989,
"mc1_stderr": 0.01657179791062661,
"mc2": 0.48774180363761904,
"mc2_stderr": 0.015623853725331566
},
"harness|winogrande|5": {
"acc": 0.7529597474348856,
"acc_stderr": 0.012121402942855573
},
"harness|gsm8k|5": {
"acc": 0.3282789992418499,
"acc_stderr": 0.01293475801944961
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-eval-inverse-scaling__41-inverse-scaling__41-10b85d-1679259339 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- inverse-scaling/41
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-125m_eval
metrics: []
dataset_name: inverse-scaling/41
dataset_config: inverse-scaling--41
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-125m_eval
* Dataset: inverse-scaling/41
* Config: inverse-scaling--41
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@MicPie](https://huggingface.co/MicPie) for evaluating this model. |
leejing/sutd_qa_dataset | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 107364.0
num_examples: 200
download_size: 56092
dataset_size: 107364.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Mitsuki-Sakamoto/alfa-deberta-re-pref-64-fil_self_1.4b_bo16_2_64_mix_50_kl_0.1_prm_160m_thr_0.0_seed_1_t_1.0 | ---
dataset_info:
config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43705646
num_examples: 18928
- name: epoch_1
num_bytes: 43780938
num_examples: 18928
- name: epoch_2
num_bytes: 43654405
num_examples: 18928
- name: epoch_3
num_bytes: 43568853
num_examples: 18928
- name: epoch_4
num_bytes: 43531281
num_examples: 18928
download_size: 116104814
dataset_size: 218241123
configs:
- config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
data_files:
- split: epoch_0
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_4-*
---
|
one-thing/chatbot_arena_conversations_hinglish | ---
license: apache-2.0
---
This dataset was created by translating the "lmsys/chatbot_arena_conversations" dataset.
Link to the original dataset - https://huggingface.co/datasets/lmsys/chatbot_arena_conversations
The original dataset contains two conversations, one from model_a and one from model_b, along with the winner model chosen between the two.
I selected the winning model's conversation and converted the user query and assistant answer into Hinglish using Gemini Pro.
open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-temporal-without-mctaco-1 | ---
pretty_name: Evaluation run of Charlie911/vicuna-7b-v1.5-lora-temporal-without-mctaco-1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Charlie911/vicuna-7b-v1.5-lora-temporal-without-mctaco-1](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-temporal-without-mctaco-1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-temporal-without-mctaco-1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-24T22:38:44.232137](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-temporal-without-mctaco-1/blob/main/results_2024-01-24T22-38-44.232137.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47390153459197415,\n\
\ \"acc_stderr\": 0.034488650414693046,\n \"acc_norm\": 0.47990494504266334,\n\
\ \"acc_norm_stderr\": 0.03528061933639502,\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015021,\n \"mc2\": 0.39945852317616254,\n\
\ \"mc2_stderr\": 0.01392826930362614\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4539249146757679,\n \"acc_stderr\": 0.014549221105171865,\n\
\ \"acc_norm\": 0.4948805460750853,\n \"acc_norm_stderr\": 0.01461062489030916\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5613423620792671,\n\
\ \"acc_stderr\": 0.004952087083128905,\n \"acc_norm\": 0.7593108942441744,\n\
\ \"acc_norm_stderr\": 0.0042662819001443916\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4868421052631579,\n \"acc_stderr\": 0.04067533136309173,\n\
\ \"acc_norm\": 0.4868421052631579,\n \"acc_norm_stderr\": 0.04067533136309173\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5056603773584906,\n \"acc_stderr\": 0.030770900763851323,\n\
\ \"acc_norm\": 0.5056603773584906,\n \"acc_norm_stderr\": 0.030770900763851323\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.4861111111111111,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\
\ \"acc_stderr\": 0.03804749744364763,\n \"acc_norm\": 0.4682080924855491,\n\
\ \"acc_norm_stderr\": 0.03804749744364763\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.03257901482099834,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.03257901482099834\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.02369541500946309,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.02369541500946309\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5161290322580645,\n\
\ \"acc_stderr\": 0.028429203176724555,\n \"acc_norm\": 0.5161290322580645,\n\
\ \"acc_norm_stderr\": 0.028429203176724555\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.033661244890514495,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.033661244890514495\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03825460278380026,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03825460278380026\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5757575757575758,\n\
\ \"acc_stderr\": 0.03521224908841586,\n \"acc_norm\": 0.5757575757575758,\n\
\ \"acc_norm_stderr\": 0.03521224908841586\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.6632124352331606,\n \"acc_stderr\": 0.03410780251836183,\n\
\ \"acc_norm\": 0.6632124352331606,\n \"acc_norm_stderr\": 0.03410780251836183\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4307692307692308,\n \"acc_stderr\": 0.02510682066053975,\n \
\ \"acc_norm\": 0.4307692307692308,\n \"acc_norm_stderr\": 0.02510682066053975\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47478991596638653,\n \"acc_stderr\": 0.0324371805513741,\n \
\ \"acc_norm\": 0.47478991596638653,\n \"acc_norm_stderr\": 0.0324371805513741\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6385321100917432,\n \"acc_stderr\": 0.020598082009937384,\n \"\
acc_norm\": 0.6385321100917432,\n \"acc_norm_stderr\": 0.020598082009937384\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35648148148148145,\n \"acc_stderr\": 0.032664783315272714,\n \"\
acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.032664783315272714\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.03460228327239171,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03460228327239171\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5949367088607594,\n \"acc_stderr\": 0.031955147413706704,\n \
\ \"acc_norm\": 0.5949367088607594,\n \"acc_norm_stderr\": 0.031955147413706704\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5619834710743802,\n \"acc_stderr\": 0.04529146804435792,\n \"\
acc_norm\": 0.5619834710743802,\n \"acc_norm_stderr\": 0.04529146804435792\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.44785276073619634,\n \"acc_stderr\": 0.03906947479456601,\n\
\ \"acc_norm\": 0.44785276073619634,\n \"acc_norm_stderr\": 0.03906947479456601\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.046355501356099754,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.046355501356099754\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5922330097087378,\n \"acc_stderr\": 0.0486577757041077,\n\
\ \"acc_norm\": 0.5922330097087378,\n \"acc_norm_stderr\": 0.0486577757041077\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7136752136752137,\n\
\ \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.7136752136752137,\n\
\ \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6602809706257982,\n\
\ \"acc_stderr\": 0.01693639411430162,\n \"acc_norm\": 0.6602809706257982,\n\
\ \"acc_norm_stderr\": 0.01693639411430162\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.026842985519615375,\n\
\ \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.026842985519615375\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.028624412550167958,\n\
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.028624412550167958\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5594855305466238,\n\
\ \"acc_stderr\": 0.02819640057419743,\n \"acc_norm\": 0.5594855305466238,\n\
\ \"acc_norm_stderr\": 0.02819640057419743\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5154320987654321,\n \"acc_stderr\": 0.0278074900442762,\n\
\ \"acc_norm\": 0.5154320987654321,\n \"acc_norm_stderr\": 0.0278074900442762\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251455,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251455\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.33833116036505867,\n\
\ \"acc_stderr\": 0.012084265626344199,\n \"acc_norm\": 0.33833116036505867,\n\
\ \"acc_norm_stderr\": 0.012084265626344199\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.030290619180485694,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.030290619180485694\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4297385620915033,\n \"acc_stderr\": 0.020027122784928547,\n \
\ \"acc_norm\": 0.4297385620915033,\n \"acc_norm_stderr\": 0.020027122784928547\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5428571428571428,\n \"acc_stderr\": 0.031891418324213966,\n\
\ \"acc_norm\": 0.5428571428571428,\n \"acc_norm_stderr\": 0.031891418324213966\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6567164179104478,\n\
\ \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.6567164179104478,\n\
\ \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.03799857454479636,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6549707602339181,\n \"acc_stderr\": 0.036459813773888065,\n\
\ \"acc_norm\": 0.6549707602339181,\n \"acc_norm_stderr\": 0.036459813773888065\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015021,\n \"mc2\": 0.39945852317616254,\n\
\ \"mc2_stderr\": 0.01392826930362614\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7134964483030781,\n \"acc_stderr\": 0.01270703013996038\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11902956785443518,\n \
\ \"acc_stderr\": 0.008919702911161632\n }\n}\n```"
repo_url: https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-temporal-without-mctaco-1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|arc:challenge|25_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|gsm8k|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hellaswag|10_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T22-38-44.232137.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T22-38-44.232137.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- '**/details_harness|winogrande|5_2024-01-24T22-38-44.232137.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-24T22-38-44.232137.parquet'
- config_name: results
data_files:
- split: 2024_01_24T22_38_44.232137
path:
- results_2024-01-24T22-38-44.232137.parquet
- split: latest
path:
- results_2024-01-24T22-38-44.232137.parquet
---
# Dataset Card for Evaluation run of Charlie911/vicuna-7b-v1.5-lora-temporal-without-mctaco-1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Charlie911/vicuna-7b-v1.5-lora-temporal-without-mctaco-1](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-temporal-without-mctaco-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-temporal-without-mctaco-1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-24T22:38:44.232137](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-temporal-without-mctaco-1/blob/main/results_2024-01-24T22-38-44.232137.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its "latest" split):
```python
{
"all": {
"acc": 0.47390153459197415,
"acc_stderr": 0.034488650414693046,
"acc_norm": 0.47990494504266334,
"acc_norm_stderr": 0.03528061933639502,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015021,
"mc2": 0.39945852317616254,
"mc2_stderr": 0.01392826930362614
},
"harness|arc:challenge|25": {
"acc": 0.4539249146757679,
"acc_stderr": 0.014549221105171865,
"acc_norm": 0.4948805460750853,
"acc_norm_stderr": 0.01461062489030916
},
"harness|hellaswag|10": {
"acc": 0.5613423620792671,
"acc_stderr": 0.004952087083128905,
"acc_norm": 0.7593108942441744,
"acc_norm_stderr": 0.0042662819001443916
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4868421052631579,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.4868421052631579,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5056603773584906,
"acc_stderr": 0.030770900763851323,
"acc_norm": 0.5056603773584906,
"acc_norm_stderr": 0.030770900763851323
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364763,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364763
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.03257901482099834,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.03257901482099834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.02369541500946309,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.02369541500946309
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5161290322580645,
"acc_stderr": 0.028429203176724555,
"acc_norm": 0.5161290322580645,
"acc_norm_stderr": 0.028429203176724555
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.033661244890514495,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.033661244890514495
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6,
"acc_stderr": 0.03825460278380026,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03825460278380026
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03521224908841586,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03521224908841586
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6632124352331606,
"acc_stderr": 0.03410780251836183,
"acc_norm": 0.6632124352331606,
"acc_norm_stderr": 0.03410780251836183
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4307692307692308,
"acc_stderr": 0.02510682066053975,
"acc_norm": 0.4307692307692308,
"acc_norm_stderr": 0.02510682066053975
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712166,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712166
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47478991596638653,
"acc_stderr": 0.0324371805513741,
"acc_norm": 0.47478991596638653,
"acc_norm_stderr": 0.0324371805513741
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6385321100917432,
"acc_stderr": 0.020598082009937384,
"acc_norm": 0.6385321100917432,
"acc_norm_stderr": 0.020598082009937384
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35648148148148145,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.35648148148148145,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03460228327239171,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03460228327239171
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5949367088607594,
"acc_stderr": 0.031955147413706704,
"acc_norm": 0.5949367088607594,
"acc_norm_stderr": 0.031955147413706704
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5619834710743802,
"acc_stderr": 0.04529146804435792,
"acc_norm": 0.5619834710743802,
"acc_norm_stderr": 0.04529146804435792
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44785276073619634,
"acc_stderr": 0.03906947479456601,
"acc_norm": 0.44785276073619634,
"acc_norm_stderr": 0.03906947479456601
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.046355501356099754,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.046355501356099754
},
"harness|hendrycksTest-management|5": {
"acc": 0.5922330097087378,
"acc_stderr": 0.0486577757041077,
"acc_norm": 0.5922330097087378,
"acc_norm_stderr": 0.0486577757041077
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7136752136752137,
"acc_stderr": 0.02961432369045665,
"acc_norm": 0.7136752136752137,
"acc_norm_stderr": 0.02961432369045665
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6602809706257982,
"acc_stderr": 0.01693639411430162,
"acc_norm": 0.6602809706257982,
"acc_norm_stderr": 0.01693639411430162
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.028624412550167958,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.028624412550167958
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5594855305466238,
"acc_stderr": 0.02819640057419743,
"acc_norm": 0.5594855305466238,
"acc_norm_stderr": 0.02819640057419743
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5154320987654321,
"acc_stderr": 0.0278074900442762,
"acc_norm": 0.5154320987654321,
"acc_norm_stderr": 0.0278074900442762
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251455,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251455
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.33833116036505867,
"acc_stderr": 0.012084265626344199,
"acc_norm": 0.33833116036505867,
"acc_norm_stderr": 0.012084265626344199
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.030290619180485694,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.030290619180485694
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4297385620915033,
"acc_stderr": 0.020027122784928547,
"acc_norm": 0.4297385620915033,
"acc_norm_stderr": 0.020027122784928547
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5428571428571428,
"acc_stderr": 0.031891418324213966,
"acc_norm": 0.5428571428571428,
"acc_norm_stderr": 0.031891418324213966
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6567164179104478,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.6567164179104478,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479636,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479636
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6549707602339181,
"acc_stderr": 0.036459813773888065,
"acc_norm": 0.6549707602339181,
"acc_norm_stderr": 0.036459813773888065
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015021,
"mc2": 0.39945852317616254,
"mc2_stderr": 0.01392826930362614
},
"harness|winogrande|5": {
"acc": 0.7134964483030781,
"acc_stderr": 0.01270703013996038
},
"harness|gsm8k|5": {
"acc": 0.11902956785443518,
"acc_stderr": 0.008919702911161632
}
}
```
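As a rough illustration of how an aggregate number relates to the per-task scores above, the sketch below averages accuracies over a small hand-copied excerpt of the results dict. This is illustrative only: the actual leaderboard aggregation may select and weight metrics differently than this simple mean.

```python
# Illustrative only: average per-task accuracies from a results excerpt.
# The task names and values are copied from the results above; the real
# leaderboard aggregation may differ (e.g., which metric it picks per task).
results_excerpt = {
    "harness|arc:challenge|25": {"acc_norm": 0.4948805460750853},
    "harness|hellaswag|10": {"acc_norm": 0.7593108942441744},
    "harness|winogrande|5": {"acc": 0.7134964483030781},
    "harness|gsm8k|5": {"acc": 0.11902956785443518},
}

# Prefer acc_norm where present, otherwise fall back to acc.
scores = [m.get("acc_norm", m.get("acc")) for m in results_excerpt.values()]
average = sum(scores) / len(scores)
print(f"average over excerpt: {average:.4f}")
```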
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
songGr/yanyingbang | ---
dataset_info:
features:
- name: image
struct:
- name: bytes
dtype: binary
- name: path
dtype: 'null'
- name: text
dtype: string
splits:
- name: train
num_bytes: 3223353
num_examples: 26
download_size: 3224783
dataset_size: 3223353
---
|
Zack157/junin | ---
license: openrail
---
|
result-kand2-sdxl-wuerst-karlo/92f7fec0 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 251
num_examples: 10
download_size: 1446
dataset_size: 251
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "92f7fec0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shirsh10mall/Image_Captioning_Dataset | ---
dataset_info:
features:
- name: caption
dtype: string
- name: image_data
dtype: image
splits:
- name: train
num_bytes: 395280032.345
num_examples: 1318
download_size: 0
dataset_size: 395280032.345
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Image_Captioning_Dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NobodyExistsOnTheInternet/ToxicCoder | ---
license: mit
---
|
pourmand1376/asr-farsi-youtube-chunked-10-seconds | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: video_id
dtype: string
- name: segment_id
dtype: int64
- name: title
dtype: string
- name: transcription
dtype: string
- name: youtube_url
dtype: string
splits:
- name: train
num_bytes: 15765226460.953596
num_examples: 113204
- name: test
num_bytes: 1957249220.5622025
num_examples: 14151
- name: val
num_bytes: 1977921383.4392023
num_examples: 14151
download_size: 19638509254
dataset_size: 19700397064.955
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
---
|
minyoung9353/malang2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 113455.0
num_examples: 1
download_size: 111832
dataset_size: 113455.0
---
# Dataset Card for "malang2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nemo-explore/Styles-and-Embeddings | ---
license: cc-by-nc-sa-4.0
---
|
tyzhu/lmind_hotpot_train8000_eval7405_v1_docidx | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: train_ic_qa
path: data/train_ic_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: eval_ic_qa
path: data/eval_ic_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
splits:
- name: train_qa
num_bytes: 1380987
num_examples: 8000
- name: train_recite_qa
num_bytes: 8547861
num_examples: 8000
- name: train_ic_qa
num_bytes: 8539861
num_examples: 8000
- name: eval_qa
num_bytes: 1201450
num_examples: 7405
- name: eval_recite_qa
num_bytes: 7941487
num_examples: 7405
- name: eval_ic_qa
num_bytes: 7934082
num_examples: 7405
- name: all_docs
num_bytes: 12508009
num_examples: 26854
- name: all_docs_eval
num_bytes: 12506219
num_examples: 26854
- name: train
num_bytes: 12508009
num_examples: 26854
- name: validation
num_bytes: 12506219
num_examples: 26854
download_size: 0
dataset_size: 85574184
---
# Dataset Card for "lmind_hotpot_train8000_eval7405_v1_docidx"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anhdungitvn/vmlu_v1.5 | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype: string
- name: prompt
dtype: string
splits:
- name: dev
num_bytes: 221914
num_examples: 303
- name: test
num_bytes: 8061049
num_examples: 9833
- name: valid
num_bytes: 569931
num_examples: 744
download_size: 3249881
dataset_size: 8852894
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
# Description
This dataset, [anhdungitvn/vmlu_v1.5](https://huggingface.co/datasets/anhdungitvn/vmlu_v1.5), was created from the original [vmlu_v1.5](https://vmlu.ai/#DataSection) release by converting its JSONL files into the Hugging Face datasets format for easier use.
## Example
```python
{
"id": "28-0023",
"question": "Tỷ giá thay đổi sẽ ảnh hưởng đến",
"choices": [
"A. Cán cân thương mại",
"B. Cán cân thanh toán",
"C. Sản lượng quốc gia",
"D. Các lựa chọn đều đúng"
],
"answer": "",
"prompt": "Chỉ đưa ra chữ cái đứng trước câu trả lời đúng (A, B, C, D hoặc E) của câu hỏi trắc nghiệm sau: \nTỷ giá thay đổi sẽ ảnh hưởng đến\n\nA. Cán cân thương mại\nB. Cán cân thanh toán\nC. Sản lượng quốc gia\nD. Các lựa chọn đều đúng\nĐáp án: "
}
```
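The `prompt` field is deterministic: a fixed Vietnamese instruction, then the question, the choices, and the answer cue `Đáp án: ` (the exact conversion code is shown in the steps below). The example row's prompt can be rebuilt with plain string operations:

```python
# Rebuild the `prompt` field for the example row above (stdlib only).
question = "Tỷ giá thay đổi sẽ ảnh hưởng đến"
choices = [
    "A. Cán cân thương mại",
    "B. Cán cân thanh toán",
    "C. Sản lượng quốc gia",
    "D. Các lựa chọn đều đúng",
]
prompt = (
    "Chỉ đưa ra chữ cái đứng trước câu trả lời đúng (A, B, C, D hoặc E) "
    "của câu hỏi trắc nghiệm sau: \n"
    + question + "\n\n"
    + "\n".join(choices)
    + "\nĐáp án: "
)
assert prompt.endswith("Đáp án: ")
```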
# How to use?
```python
from datasets import load_dataset
dataset_name_or_path = "anhdungitvn/vmlu_v1.5"
dataset = load_dataset(dataset_name_or_path)
```
# How was the dataset converted?
<details>
<summary>Step 1: JSONL to dataset</summary>
```python
import json
from datasets import Dataset
from datasets import DatasetDict
def read(file):
    data = []
    # Specify UTF-8 explicitly: the files contain Vietnamese text and the
    # platform default encoding is not guaranteed to handle it.
    with open(file, 'r', encoding='utf-8') as f:
        for line in f:
            if line.strip():  # skip blank lines
                data.append(json.loads(line))
    return Dataset.from_list(data)
dataset = DatasetDict(
{
"dev": read("dev.jsonl"),
"valid": read("valid.jsonl"),
"test": read("test.jsonl"),
}
)
dataset['test'] = dataset['test'].add_column("answer", [""]*len(dataset['test']))
```
Output:
```python
DatasetDict({
dev: Dataset({
features: ['id', 'question', 'choices', 'answer'],
num_rows: 303
})
test: Dataset({
features: ['id', 'question', 'choices', 'answer'],
num_rows: 9833
})
valid: Dataset({
features: ['id', 'question', 'choices', 'answer'],
num_rows: 744
})
})
```
</details>
<details>
<summary>Step 2: Add prompt</summary>
```python
def doc2prompt(doc):
text_choice = '\n'.join(doc['choices'])
prompt = "Chỉ đưa ra chữ cái đứng trước câu trả lời đúng (A, B, C, D hoặc E) của câu hỏi trắc nghiệm sau: \n" \
+ doc["question"] \
+ "\n\n" \
+ text_choice \
+ "\n" \
+ "Đáp án: "
return {"prompt": prompt}
dataset = dataset.map(doc2prompt, batched=False)
```
Output:
```python
DatasetDict({
dev: Dataset({
features: ['id', 'question', 'choices', 'answer', 'prompt'],
num_rows: 303
})
test: Dataset({
features: ['id', 'question', 'choices', 'answer', 'prompt'],
num_rows: 9833
})
valid: Dataset({
features: ['id', 'question', 'choices', 'answer', 'prompt'],
num_rows: 744
})
})
```
</details>
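Since every prompt ends with the cue `Đáp án: `, a model being evaluated is expected to answer with a single choice letter. A minimal, hypothetical post-processing helper (not part of the original conversion pipeline) for pulling that letter out of free-form output could look like:

```python
import re

def extract_choice(output: str) -> str:
    """Return the first standalone choice letter (A-E) in a model's output, or '' if none."""
    m = re.search(r"\b([A-E])\b", output)
    return m.group(1) if m else ""

print(extract_choice("Đáp án: B"))  # -> B
```

Scoring on the labeled `dev`/`valid` splits then reduces to comparing the extracted letter against the gold `answer` field (which is intentionally left empty for `test`).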
# References
- [VMLU](https://vmlu.ai/)
|
gwlms/germeval2014 | ---
viewer: false
license: cc-by-4.0
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-LOC
'2': I-LOC
'3': B-LOCderiv
'4': I-LOCderiv
'5': B-LOCpart
'6': I-LOCpart
'7': B-ORG
'8': I-ORG
'9': B-ORGderiv
'10': I-ORGderiv
'11': B-ORGpart
'12': I-ORGpart
'13': B-OTH
'14': I-OTH
'15': B-OTHderiv
'16': I-OTHderiv
'17': B-OTHpart
'18': I-OTHpart
'19': B-PER
'20': I-PER
'21': B-PERderiv
'22': I-PERderiv
'23': B-PERpart
'24': I-PERpart
- name: ner_t5_output
dtype: string
- name: ner_own_output
dtype: string
splits:
- name: train
num_bytes: 9450958
num_examples: 24000
- name: validation
num_bytes: 866649
num_examples: 2200
- name: test
num_bytes: 2011187
num_examples: 5100
download_size: 4279522
dataset_size: 12328794
---
|
botsi/trust-game-llama-2-chat-history | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: history_13B_chat
data_files: history_13B_chat.csv
- config_name: history_70B_chat
data_files: history_70B_chat.csv
- config_name: history_7B_chat
data_files: history_7B_chat.csv
dataset_info:
features:
- name: conversation
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 224
num_examples: 6
download_size: 1630
dataset_size: 224
---
|
djwright/mon-circular-bw | ---
license: cc0-1.0
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 324687.0
num_examples: 7
download_size: 0
dataset_size: 324687.0
---
|
Sinias/DCritelli | ---
license: afl-3.0
---
|
Locutusque/camel-chatml | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 226124347
num_examples: 110000
download_size: 106727974
dataset_size: 226124347
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "camel-chatml"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V2-16k | ---
pretty_name: Evaluation run of totally-not-an-llm/EverythingLM-13b-V2-16k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [totally-not-an-llm/EverythingLM-13b-V2-16k](https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V2-16k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V2-16k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T16:08:08.117578](https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V2-16k/blob/main/results_2023-09-17T16-08-08.117578.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0024119127516778523,\n\
\ \"em_stderr\": 0.0005023380498893423,\n \"f1\": 0.060858850671140774,\n\
\ \"f1_stderr\": 0.0013785298252049116,\n \"acc\": 0.39915227208673193,\n\
\ \"acc_stderr\": 0.009710896158035016\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0024119127516778523,\n \"em_stderr\": 0.0005023380498893423,\n\
\ \"f1\": 0.060858850671140774,\n \"f1_stderr\": 0.0013785298252049116\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06823351023502654,\n \
\ \"acc_stderr\": 0.006945358944067431\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7300710339384373,\n \"acc_stderr\": 0.0124764333720026\n\
\ }\n}\n```"
repo_url: https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V2-16k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|arc:challenge|25_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T16_08_08.117578
path:
- '**/details_harness|drop|3_2023-09-17T16-08-08.117578.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T16-08-08.117578.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T16_08_08.117578
path:
- '**/details_harness|gsm8k|5_2023-09-17T16-08-08.117578.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T16-08-08.117578.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hellaswag|10_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T16:18:10.252388.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T16_18_10.252388
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T16:18:10.252388.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T16:18:10.252388.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T16_08_08.117578
path:
- '**/details_harness|winogrande|5_2023-09-17T16-08-08.117578.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T16-08-08.117578.parquet'
- config_name: results
data_files:
- split: 2023_09_17T16_08_08.117578
path:
- results_2023-09-17T16-08-08.117578.parquet
- split: latest
path:
- results_2023-09-17T16-08-08.117578.parquet
---
# Dataset Card for Evaluation run of totally-not-an-llm/EverythingLM-13b-V2-16k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V2-16k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [totally-not-an-llm/EverythingLM-13b-V2-16k](https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V2-16k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V2-16k",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T16:08:08.117578](https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V2-16k/blob/main/results_2023-09-17T16-08-08.117578.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893423,
"f1": 0.060858850671140774,
"f1_stderr": 0.0013785298252049116,
"acc": 0.39915227208673193,
"acc_stderr": 0.009710896158035016
},
"harness|drop|3": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893423,
"f1": 0.060858850671140774,
"f1_stderr": 0.0013785298252049116
},
"harness|gsm8k|5": {
"acc": 0.06823351023502654,
"acc_stderr": 0.006945358944067431
},
"harness|winogrande|5": {
"acc": 0.7300710339384373,
"acc_stderr": 0.0124764333720026
}
}
```
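For downstream analysis, the per-task accuracies can be pulled out of this payload with a small helper. The dict below is a trimmed, hand-copied subset of the JSON above, used only to illustrate the access pattern:

```python
# Trimmed copy of the latest-results payload shown above.
latest = {
    "all": {"acc": 0.39915227208673193, "acc_stderr": 0.009710896158035016},
    "harness|gsm8k|5": {"acc": 0.06823351023502654},
    "harness|winogrande|5": {"acc": 0.7300710339384373},
}

# Collect per-task accuracies, skipping the aggregate "all" entry.
accuracies = {
    task: metrics["acc"]
    for task, metrics in latest.items()
    if task != "all" and "acc" in metrics
}
print(accuracies)
```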
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
causal-lm/instruction_gpt4 | ---
language: en
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 36142375
num_examples: 46801
- name: validation
num_bytes: 4036576
num_examples: 5201
download_size: 24132608
dataset_size: 40178951
---
# Dataset Card for "instruction_gpt4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigscience/xP3mt | ---
annotations_creators:
- expert-generated
- crowdsourced
language:
- ak
- ar
- as
- bm
- bn
- ca
- code
- en
- es
- eu
- fon
- fr
- gu
- hi
- id
- ig
- ki
- kn
- lg
- ln
- ml
- mr
- ne
- nso
- ny
- or
- pa
- pt
- rn
- rw
- sn
- st
- sw
- ta
- te
- tn
- ts
- tum
- tw
- ur
- vi
- wo
- xh
- yo
- zh
- zu
programming_language:
- C
- C++
- C#
- Go
- Java
- JavaScript
- Lua
- PHP
- Python
- Ruby
- Rust
- Scala
- TypeScript
license:
- apache-2.0
multilinguality:
- multilingual
pretty_name: xP3
size_categories:
- 100M<n<1B
task_categories:
- other
---
# Dataset Card for xP3
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** https://github.com/bigscience-workshop/xmtf
- **Paper:** [Crosslingual Generalization through Multitask Finetuning](https://arxiv.org/abs/2211.01786)
- **Point of Contact:** [Niklas Muennighoff](mailto:niklas@hf.co)
### Dataset Summary
> xP3 (Crosslingual Public Pool of Prompts) is a collection of prompts & datasets across 46 languages & 16 NLP tasks. It is used for the training of BLOOMZ and mT0, multilingual language models capable of following human instructions in dozens of languages zero-shot.
- **Creation:** The dataset can be recreated using instructions available [here](https://github.com/bigscience-workshop/xmtf#create-xp3). We provide this version to save processing time and ease reproducibility.
- **Languages:** 46 (Can be extended by [recreating with more splits](https://github.com/bigscience-workshop/xmtf#create-xp3))
- **xP3 Dataset Family:**
<table>
<tr>
<th>Name</th>
<th>Explanation</th>
<th>Example models</th>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/Muennighoff/xP3x>xP3x</a></td>
<td>Mixture of 17 tasks in 277 languages with English prompts</td>
<td>WIP - Join us at Project Aya @<a href=https://cohere.for.ai/>C4AI</a> to help!</td>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/bigscience/xP3>xP3</a></td>
<td>Mixture of 13 training tasks in 46 languages with English prompts</td>
<td><a href=https://huggingface.co/bigscience/bloomz>bloomz</a> & <a href=https://huggingface.co/bigscience/mt0-xxl>mt0-xxl</a></td>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/bigscience/xP3mt>xP3mt</a></td>
<td>Mixture of 13 training tasks in 46 languages with prompts in 20 languages (machine-translated from English)</td>
<td><a href=https://huggingface.co/bigscience/bloomz-mt>bloomz-mt</a> & <a href=https://huggingface.co/bigscience/mt0-xxl-mt>mt0-xxl-mt</a></td>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/bigscience/xP3all>xP3all</a></td>
<td>xP3 + evaluation datasets adding an additional 3 tasks for a total of 16 tasks in 46 languages with English prompts</td>
<td></td>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/bigscience/xP3megds>xP3megds</a></td>
<td><a href=https://github.com/bigscience-workshop/Megatron-DeepSpeed>Megatron-DeepSpeed</a> processed version of xP3</td>
<td><a href=https://huggingface.co/bigscience/bloomz>bloomz</a></td>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/Muennighoff/P3>P3</a></td>
<td>Repreprocessed version of the English-only <a href=https://huggingface.co/datasets/bigscience/P3>P3</a> with 8 training tasks</td>
<td><a href=https://huggingface.co/bigscience/bloomz-p3>bloomz-p3</a> & <a href=https://huggingface.co/bigscience/mt0-xxl-p3>mt0-xxl-p3</a></td>
</tr>
</table>
## Dataset Structure
### Data Instances
An example of "train" looks as follows:
```json
{
"inputs": "Oración 1: Fue académico en literatura metafísica, teología y ciencias clásicas.\nOración 2: Fue académico en literatura metafísica, teología y ciencia clásica.\nPregunta: ¿La oración 1 parafrasea la oración 2? ¿Si o no?",
"targets": "Sí"
}
```
### Data Fields
The data fields are the same among all splits:
- `inputs`: the natural language input fed to the model
- `targets`: the natural language target that the model has to generate
### Data Splits
The table below summarizes sizes per language (computed from the `merged_{lang}.jsonl` files). Because languages like `tw` consist only of single-sentence translation samples from Flores, their byte percentage is significantly lower than their sample percentage. We machine-translated prompts for monolingual datasets, so languages with only crosslingual datasets (e.g. translation) do not have non-English prompts. Languages without non-English prompts are equivalent to [xP3](https://huggingface.co/datasets/bigscience/xP3).
|Language|Kilobytes|%|Samples|%|Non-English prompts|
|--------|------:|-:|---:|-:|-:|
|tw|106288|0.11|265071|0.33| |
|bm|107056|0.11|265180|0.33| |
|ak|108096|0.11|265071|0.33| |
|ca|110608|0.11|271191|0.34| |
|eu|113008|0.12|281199|0.35| |
|fon|113072|0.12|265063|0.33| |
|st|114080|0.12|265063|0.33| |
|ki|115040|0.12|265180|0.33| |
|tum|116032|0.12|265063|0.33| |
|wo|122560|0.13|365063|0.46| |
|ln|126304|0.13|365060|0.46| |
|as|156256|0.16|265063|0.33| |
|or|161472|0.17|265063|0.33| |
|kn|165456|0.17|265063|0.33| |
|ml|175040|0.18|265864|0.33| |
|rn|192992|0.2|318189|0.4| |
|nso|229712|0.24|915051|1.14| |
|tn|235536|0.24|915054|1.14| |
|lg|235936|0.24|915021|1.14| |
|rw|249360|0.26|915043|1.14| |
|ts|250256|0.26|915044|1.14| |
|sn|252496|0.26|865056|1.08| |
|xh|254672|0.26|915058|1.14| |
|zu|263712|0.27|915061|1.14| |
|ny|272128|0.28|915063|1.14| |
|ig|325440|0.33|950097|1.19|✅|
|yo|339664|0.35|913021|1.14|✅|
|ne|398144|0.41|315754|0.39|✅|
|pa|529632|0.55|339210|0.42|✅|
|sw|561392|0.58|1114439|1.39|✅|
|gu|566576|0.58|347499|0.43|✅|
|mr|674000|0.69|417269|0.52|✅|
|bn|854864|0.88|428725|0.54|✅|
|ta|943440|0.97|410633|0.51|✅|
|te|1384016|1.42|573354|0.72|✅|
|ur|1944416|2.0|855756|1.07|✅|
|vi|3113184|3.2|1667306|2.08|✅|
|code|4330752|4.46|2707724|3.38| |
|hi|4469712|4.6|1543441|1.93|✅|
|id|4538768|4.67|2582272|3.22|✅|
|zh|4604112|4.74|3571636|4.46|✅|
|ar|4703968|4.84|2148970|2.68|✅|
|fr|5558912|5.72|5055942|6.31|✅|
|pt|6130016|6.31|3562772|4.45|✅|
|es|7579424|7.8|5151349|6.43|✅|
|en|39252528|40.4|32740750|40.87| |
|total|97150128|100.0|80100816|100.0|✅|
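As a quick sanity check, the percentage columns can be recomputed from the kilobyte counts in the table (values copied from the `en` and `code` rows and the total):

```python
# Recompute byte shares from the kilobyte counts listed in the table above.
total_kb = 97150128
en_kb = 39252528
code_kb = 4330752

en_share = round(100 * en_kb / total_kb, 1)
code_share = round(100 * code_kb / total_kb, 2)
print(en_share, code_share)  # matches the 40.4 and 4.46 shown in the table
```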
## Dataset Creation
### Source Data
#### Training datasets
- Code Miscellaneous
- [CodeComplex](https://huggingface.co/datasets/codeparrot/codecomplex)
- [Docstring Corpus](https://huggingface.co/datasets/teven/code_docstring_corpus)
- [GreatCode](https://huggingface.co/datasets/great_code)
- [State Changes](https://huggingface.co/datasets/Fraser/python-state-changes)
- Closed-book QA
- [Hotpot QA](https://huggingface.co/datasets/hotpot_qa)
- [Trivia QA](https://huggingface.co/datasets/trivia_qa)
- [Web Questions](https://huggingface.co/datasets/web_questions)
- [Wiki QA](https://huggingface.co/datasets/wiki_qa)
- Extractive QA
- [Adversarial QA](https://huggingface.co/datasets/adversarial_qa)
- [CMRC2018](https://huggingface.co/datasets/cmrc2018)
- [DRCD](https://huggingface.co/datasets/clue)
- [DuoRC](https://huggingface.co/datasets/duorc)
- [MLQA](https://huggingface.co/datasets/mlqa)
- [Quoref](https://huggingface.co/datasets/quoref)
- [ReCoRD](https://huggingface.co/datasets/super_glue)
- [ROPES](https://huggingface.co/datasets/ropes)
- [SQuAD v2](https://huggingface.co/datasets/squad_v2)
- [xQuAD](https://huggingface.co/datasets/xquad)
- TyDI QA
- [Primary](https://huggingface.co/datasets/khalidalt/tydiqa-primary)
- [Goldp](https://huggingface.co/datasets/khalidalt/tydiqa-goldp)
- Multiple-Choice QA
- [ARC](https://huggingface.co/datasets/ai2_arc)
- [C3](https://huggingface.co/datasets/c3)
- [CoS-E](https://huggingface.co/datasets/cos_e)
- [Cosmos](https://huggingface.co/datasets/cosmos)
- [DREAM](https://huggingface.co/datasets/dream)
- [MultiRC](https://huggingface.co/datasets/super_glue)
- [OpenBookQA](https://huggingface.co/datasets/openbookqa)
- [PiQA](https://huggingface.co/datasets/piqa)
- [QUAIL](https://huggingface.co/datasets/quail)
- [QuaRel](https://huggingface.co/datasets/quarel)
- [QuaRTz](https://huggingface.co/datasets/quartz)
- [QASC](https://huggingface.co/datasets/qasc)
- [RACE](https://huggingface.co/datasets/race)
- [SciQ](https://huggingface.co/datasets/sciq)
- [Social IQA](https://huggingface.co/datasets/social_i_qa)
- [Wiki Hop](https://huggingface.co/datasets/wiki_hop)
- [WiQA](https://huggingface.co/datasets/wiqa)
- Paraphrase Identification
- [MRPC](https://huggingface.co/datasets/super_glue)
- [PAWS](https://huggingface.co/datasets/paws)
- [PAWS-X](https://huggingface.co/datasets/paws-x)
- [QQP](https://huggingface.co/datasets/qqp)
- Program Synthesis
- [APPS](https://huggingface.co/datasets/codeparrot/apps)
- [CodeContests](https://huggingface.co/datasets/teven/code_contests)
- [JupyterCodePairs](https://huggingface.co/datasets/codeparrot/github-jupyter-text-code-pairs)
- [MBPP](https://huggingface.co/datasets/Muennighoff/mbpp)
- [NeuralCodeSearch](https://huggingface.co/datasets/neural_code_search)
- [XLCoST](https://huggingface.co/datasets/codeparrot/xlcost-text-to-code)
- Structure-to-text
- [Common Gen](https://huggingface.co/datasets/common_gen)
- [Wiki Bio](https://huggingface.co/datasets/wiki_bio)
- Sentiment
- [Amazon](https://huggingface.co/datasets/amazon_polarity)
- [App Reviews](https://huggingface.co/datasets/app_reviews)
- [IMDB](https://huggingface.co/datasets/imdb)
- [Rotten Tomatoes](https://huggingface.co/datasets/rotten_tomatoes)
- [Yelp](https://huggingface.co/datasets/yelp_review_full)
- Simplification
- [BiSECT](https://huggingface.co/datasets/GEM/BiSECT)
- Summarization
- [CNN Daily Mail](https://huggingface.co/datasets/cnn_dailymail)
- [Gigaword](https://huggingface.co/datasets/gigaword)
- [MultiNews](https://huggingface.co/datasets/multi_news)
- [SamSum](https://huggingface.co/datasets/samsum)
- [Wiki-Lingua](https://huggingface.co/datasets/GEM/wiki_lingua)
- [XLSum](https://huggingface.co/datasets/GEM/xlsum)
- [XSum](https://huggingface.co/datasets/xsum)
- Topic Classification
- [AG News](https://huggingface.co/datasets/ag_news)
- [DBPedia](https://huggingface.co/datasets/dbpedia_14)
- [TNEWS](https://huggingface.co/datasets/clue)
- [TREC](https://huggingface.co/datasets/trec)
- [CSL](https://huggingface.co/datasets/clue)
- Translation
- [Flores-200](https://huggingface.co/datasets/Muennighoff/flores200)
- [Tatoeba](https://huggingface.co/datasets/Helsinki-NLP/tatoeba_mt)
- Word Sense disambiguation
- [WiC](https://huggingface.co/datasets/super_glue)
- [XL-WiC](https://huggingface.co/datasets/pasinit/xlwic)
#### Evaluation datasets (included in [xP3all](https://huggingface.co/datasets/bigscience/xP3all) except for NLI & HumanEval)
- Natural Language Inference (NLI)
- [ANLI](https://huggingface.co/datasets/anli)
- [CB](https://huggingface.co/datasets/super_glue)
- [RTE](https://huggingface.co/datasets/super_glue)
- [XNLI](https://huggingface.co/datasets/xnli)
- Coreference Resolution
- [Winogrande](https://huggingface.co/datasets/winogrande)
- [XWinograd](https://huggingface.co/datasets/Muennighoff/xwinograd)
- Program Synthesis
- [HumanEval](https://huggingface.co/datasets/openai_humaneval)
- Sentence Completion
- [COPA](https://huggingface.co/datasets/super_glue)
- [Story Cloze](https://huggingface.co/datasets/story_cloze)
- [XCOPA](https://huggingface.co/datasets/xcopa)
- [XStoryCloze](https://huggingface.co/datasets/Muennighoff/xstory_cloze)
## Additional Information
### Licensing Information
The dataset is released under Apache 2.0.
### Citation Information
```bibtex
@misc{muennighoff2022crosslingual,
title={Crosslingual Generalization through Multitask Finetuning},
author={Niklas Muennighoff and Thomas Wang and Lintang Sutawika and Adam Roberts and Stella Biderman and Teven Le Scao and M Saiful Bari and Sheng Shen and Zheng-Xin Yong and Hailey Schoelkopf and Xiangru Tang and Dragomir Radev and Alham Fikri Aji and Khalid Almubarak and Samuel Albanie and Zaid Alyafeai and Albert Webson and Edward Raff and Colin Raffel},
year={2022},
eprint={2211.01786},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to the contributors of [promptsource](https://github.com/bigscience-workshop/promptsource/graphs/contributors) for adding many prompts used in this dataset. |
Multimodal-Fatima/VQAv2_minival_google_flan_t5_xxl_mode_VQAv2_visclues_detection_ns_25994_open_ended | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_bs_16
num_bytes: 3799851
num_examples: 25994
download_size: 0
dataset_size: 3799851
---
# Dataset Card for "VQAv2_minival_google_flan_t5_xxl_mode_VQAv2_visclues_detection_ns_25994_open_ended"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
djh992/pdbbind_complex_GB2022 | ---
tags:
- molecules
- Chemistry
- SMILES
- Protein
- Ligand
- Complexes
---
### To generate the dataset
Register for an account at https://www.pdbbind.org.cn/, confirm the validation email, then log in and download:
1. the index files
2. the general protein-ligand complexes
3. the refined protein-ligand complexes

Extract those files into `pdbbind_complex_GB2022/data`.
Run the script `pdbbind.py` in a compute job on an MPI-enabled cluster (e.g., `mpirun -n 64 pdbbind.py`).
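The preparation steps above can be sketched as follows; the archive filenames are placeholders, since the actual names of the PDBbind downloads may differ:

```python
# Sketch of the data-preparation steps; archive names below are assumptions --
# substitute the actual filenames of the PDBbind downloads.
import tarfile
from pathlib import Path

data_dir = Path("pdbbind_complex_GB2022/data")
data_dir.mkdir(parents=True, exist_ok=True)

for name in ["index_files.tar.gz", "general_set.tar.gz", "refined_set.tar.gz"]:
    archive = data_dir / name
    if archive.exists():
        with tarfile.open(archive) as tf:
            tf.extractall(data_dir)

# Then, on an MPI-enabled cluster:
#   mpirun -n 64 python pdbbind.py
```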
Output will be tar files in `train/`, `val/` and `test/` folders, following the split defined in the EquiBind manuscript. |
TuanBC/D50_val_datasets | ---
dataset_info:
features:
- name: key
dtype: string
- name: audio_path
dtype: string
- name: text_path
dtype: string
- name: text
dtype: string
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 8188711689
num_examples: 8522
download_size: 1382194960
dataset_size: 8188711689
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "D50_val_datasets"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yn01/test_20240108_02 | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 595
num_examples: 5
download_size: 1732
dataset_size: 595
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
arbml/CIDAR-EVAL-100 | ---
dataset_info:
features:
- name: Source
dtype: string
- name: Sentence
dtype: string
- name: Topic
dtype: string
splits:
- name: train
num_bytes: 10696
num_examples: 100
download_size: 6725
dataset_size: 10696
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- text-generation
language:
- ar
pretty_name: CIDAR-EVAL-100
size_categories:
- n<1K
---
# Dataset Card for "CIDAR-EVAL-100"
# CIDAR-EVAL-100
CIDAR-EVAL-100 contains **100** instructions about Arabic culture. The dataset can be used to evaluate an LLM for culturally relevant answers.
## 📚 Datasets Summary
<table>
<tr>
<th>Name</th>
<th>Explanation</th>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/arbml/cidar>CIDAR</a></td>
<td>10,000 instructions and responses in Arabic</td>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/arbml/cidar-eval-100><b>CIDAR-EVAL-100</b></a></td>
<td>100 instructions to evaluate LLMs on cultural relevance</td>
</tr>
<tr>
<td><a href=https://huggingface.co/datasets/arbml/cidar-mcq-100>CIDAR-MCQ-100</a></td>
<td>100 Multiple choice questions and answers to evaluate LLMs on cultural relevance </td>
</tr>
</table>
<div width="30px" align="center">
| Category | <a href=https://huggingface.co/datasets/arbml/cidar-eval-100><b>CIDAR-EVAL-100</b></a> | CIDAR-MCQ-100 |
|----------|:-------------:|:------:|
|Food&Drinks | 14 | 8 |
|Names | 14 | 8 |
|Animals | 2 | 4 |
|Language | 10 | 20 |
|Jokes&Puzzles | 3 | 7 |
|Religion | 5 | 10 |
|Business | 6 | 7 |
|Cloths | 4 | 5 |
|Science | 3 | 4 |
|Sports&Games | 4 | 2 |
|Tradition | 4 | 10 |
|Weather | 4 | 2 |
|Geography | 7 | 8 |
|General | 4 | 3 |
|Fonts | 5 | 2 |
|Literature | 10 | 2 |
|Plants | 3 | 0 |
|<i>Total</i> | 100 | 100 |
</div>
## 📋 Dataset Structure
- `Source(str)`: Source of the instruction.
- `Sentence(str)`: Sentence of the instruction.
- `Topic(str)`: Topic covered by the instruction.
## 📁 Loading The Dataset
You can download the dataset directly from HuggingFace or use the following code:
```python
from datasets import load_dataset
cidar = load_dataset('arbml/CIDAR-EVAL-100')
```
## 📄 Sample From The Dataset:
**Source**: Manual
**Sentence**: أخبرني عن أشهر أربعة حيوانات في المنطقة
**Topic**: Animals
## 🔑 License
The dataset is licensed under [Apache-2.0](https://www.apache.org/licenses/LICENSE-2.0).
## Citation
```
@misc{alyafeai2024cidar,
title={{CIDAR: Culturally Relevant Instruction Dataset For Arabic}},
author={Zaid Alyafeai and Khalid Almubarak and Ahmed Ashraf and Deema Alnuhait and Saied Alshahrani and Gubran A. Q. Abdulrahman and Gamil Ahmed and Qais Gawah and Zead Saleh and Mustafa Ghaleb and Yousef Ali and Maged S. Al-Shaibani},
year={2024},
eprint={2402.03177},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_100 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1289981820.0
num_examples: 253335
download_size: 1318311624
dataset_size: 1289981820.0
---
# Dataset Card for "chunk_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
julep-ai/orca_mini_uncensored | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 296108908.2480292
num_examples: 74778
- name: test
num_bytes: 32902309.751970828
num_examples: 8309
download_size: 142899905
dataset_size: 329011218.0
---
# Dataset Card for "orca_minis_uncensored-joined"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TalTechNLP/AMIsum | ---
pretty_name: AMIsum
annotations_creators:
- expert-generated
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- n<1K
source_datasets:
- original
task_categories:
- summarization
paperswithcode_id: ami-sum
---
# Dataset Card for "AMIsum"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
AMIsum is a meeting summarization dataset based on the AMI Meeting Corpus (https://groups.inf.ed.ac.uk/ami/corpus/). The dataset utilizes the transcripts as the source data and abstractive summaries as the target data.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
English
## Dataset Structure
### Data Instances
```
{'transcript': '<PM> Okay. <PM> Right. <PM> Um well this is the kick-off meeting for our our project. <PM> Um and um this is just what we're gonna be doing over the next twenty five minutes. <ME> Mm-hmm. <PM> Um so first of all, just to kind of make sure that we all know each other, I'm Laura and I'm the project manager. <PM> Do you want to introduce yourself again? <ME> Great. [...]', 'summary': 'The project manager introduced the upcoming project to the team members and then the team members participated in an exercise in which they drew their favorite animal and discussed what they liked about the animal. The project manager talked about the project finances and selling prices. The team then discussed various features to consider in making the remote.', 'id': 'ES2002a'}
```
### Data Fields
```
transcript: Expert generated transcript.
summary: Expert generated summary.
id: Meeting id.
```
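The card provides no usage snippet, so here is a minimal, hedged sketch of working with a record of this shape; the `<ROLE>` turn markup is inferred from the data instance above, not from an official loader.

```python
import re

# Sketch: split a speaker-tagged AMIsum transcript into (role, utterance)
# turns. The <PM>/<ME> tag format follows the data instance above.
def speaker_turns(transcript):
    parts = re.split(r"<([A-Z]+)>", transcript)
    # parts alternates: ['', 'PM', ' Okay. ', 'PM', ' Right. ', ...]
    return [(parts[i], parts[i + 1].strip()) for i in range(1, len(parts) - 1, 2)]

record = {"transcript": "<PM> Okay. <PM> Right. <ME> Mm-hmm.", "id": "ES2002a"}
print(speaker_turns(record["transcript"]))
# [('PM', 'Okay.'), ('PM', 'Right.'), ('ME', 'Mm-hmm.')]
```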
### Data Splits
|train|validation|test|
|:----|:---------|:---|
|97|20|20| |
msr_genomics_kbcomp | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- other
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- other
task_ids: []
pretty_name: MsrGenomicsKbcomp
tags:
- genomics-knowledge-base-completion
dataset_info:
features:
- name: GENE1
dtype: string
- name: relation
dtype:
class_label:
names:
'0': Positive_regulation
'1': Negative_regulation
'2': Family
- name: GENE2
dtype: string
splits:
- name: train
num_bytes: 256789
num_examples: 12160
- name: test
num_bytes: 58116
num_examples: 2784
- name: validation
num_bytes: 27457
num_examples: 1315
download_size: 0
dataset_size: 342362
---
# Dataset Card for msr_genomics_kbcomp
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [NCI-PID-PubMed Genomics Knowledge Base Completion Dataset](https://msropendata.com/datasets/80b4f6e8-5d7c-4abc-9c79-2e51dfedd791)
- **Repository:** [NCI-PID-PubMed Genomics Knowledge Base Completion Dataset](https://msropendata.com/datasets/80b4f6e8-5d7c-4abc-9c79-2e51dfedd791)
- **Paper:** [Compositional Learning of Embeddings for Relation Paths in Knowledge Base and Text](https://www.aclweb.org/anthology/P16-1136/)
- **Point of Contact:** [Kristina Toutanova](mailto:kristout@google.com)
### Dataset Summary
The database is derived from the NCI PID Pathway Interaction Database, and the textual mentions are extracted from co-occurring pairs of genes in PubMed abstracts, processed and annotated by Literome (Poon et al. 2014). This dataset was used in the paper “Compositional Learning of Embeddings for Relation Paths in Knowledge Base and Text” (Toutanova, Lin, Yih, Poon, and Quirk, 2016). More details can be found in the included README.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English
## Dataset Structure
NCI-PID-PubMed Genomics Knowledge Base Completion Dataset
This dataset includes a database of regulation relationships among genes and corresponding textual mentions of pairs of genes in PubMed article abstracts.
The database is derived from the NCI PID Pathway Interaction Database, and the textual mentions are extracted from co-occurring pairs of genes in PubMed abstracts, processed and annotated by Literome. This dataset was used in the paper "Compositional Learning of Embeddings for Relation Paths in Knowledge Base and Text".
FILE FORMAT DETAILS
The files train.txt, valid.txt, and test.txt contain the training, development, and test set knowledge base (database of regulation relationships) triples used in the paper.
The file text.txt contains the textual triples derived from PubMed via entity linking and processing with Literome. The textual mentions were used for knowledge base completion in the paper.
The separator is a tab character; the relations are Positive_regulation, Negative_regulation, and Family (Family relationships occur only in the training set).
The format is:
| GENE1 | relation | GENE2 |
Example:
ABL1 Positive_regulation CDK2
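A minimal parsing sketch for these tab-separated files (file names as listed above; error handling omitted):

```python
# Sketch: read the tab-separated triple files (train.txt / valid.txt /
# test.txt) into (GENE1, relation, GENE2) tuples.
def parse_triples(lines):
    for line in lines:
        gene1, relation, gene2 = line.rstrip("\n").split("\t")
        yield gene1, relation, gene2

print(list(parse_triples(["ABL1\tPositive_regulation\tCDK2\n"])))
# [('ABL1', 'Positive_regulation', 'CDK2')]
```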
### Data Instances
[More Information Needed]
### Data Fields
The format is:
| GENE1 | relation | GENE2 |
### Data Splits
[More Information Needed]
## Dataset Creation
[More Information Needed]
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
[More Information Needed]
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
[More Information Needed]
### Dataset Curators
The dataset was initially created by Kristina Toutanova, Victoria Lin, Wen-tau Yih, Hoifung Poon and Chris Quirk, during work done at Microsoft Research.
### Licensing Information
[More Information Needed]
### Citation Information
```
@inproceedings{toutanova-etal-2016-compositional,
title = "Compositional Learning of Embeddings for Relation Paths in Knowledge Base and Text",
author = "Toutanova, Kristina and
Lin, Victoria and
Yih, Wen-tau and
Poon, Hoifung and
Quirk, Chris",
booktitle = "Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = aug,
year = "2016",
address = "Berlin, Germany",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/P16-1136",
doi = "10.18653/v1/P16-1136",
pages = "1434--1444",
}
```
### Contributions
Thanks to [@manandey](https://github.com/manandey) for adding this dataset. |
lmg-anon/VNTL-v3.1-1k-q | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
dataset_info:
features:
- name: text
dtype: string
- name: ignore_loss
sequence: int64
splits:
- name: train
num_bytes: 30490676
num_examples: 10145
- name: val
num_bytes: 3800301
num_examples: 1252
download_size: 15146635
dataset_size: 34290977
---
# Dataset Card for "VNTL-v3.1-1k-q"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PerceptionEval/Jigsaw | ---
configs:
- config_name: default
data_files:
- split: val
path: data/val-*
- split: test
path: data/test-*
dataset_info:
features:
- name: idx
dtype: int32
- name: question
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: choices
sequence: string
- name: answer
dtype: string
- name: prompt
dtype: string
splits:
- name: val
num_bytes: 3794024.0
num_examples: 150
- name: test
num_bytes: 4491952.0
num_examples: 150
download_size: 8062656
dataset_size: 8285976.0
---
# Dataset Card for "Jigsaw"
## Source
- [TARA](https://github.com/zeyofu/TARA)
## Task
- Determine which of image_2 (A) and image_3 (B) can complete image_1 (whose lower-right corner is missing).
DirectLLM/Safe_and_Helpful_Chinese | ---
license: bsd
language:
- zh
size_categories:
- 1M<n<10M
---
# Dataset
## Introduction
We built this Chinese dataset following the approach used to fine-tune LLama2. Since paired harmless and helpful data are needed to train a reward model, we translated and cleaned English datasets so that they can be used directly for instruction fine-tuning.
- **Dataset contents:** pku_helpful/hh_rlhf/SHP
- **Translator:** opus-mt-en-zh
## Processing
### All datasets
- Merge sub-datasets of the same type and split them into two groups, helpful and harmless
- Use the translation model `opus-mt-en-zh` to translate the English text into Chinese
- Because of the randomness of the translation model, translation errors, confusion, and repeated words can occur, e.g.:
```
有很多好的答案, 但我认为有一个简单的答案与反义相关。 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之, 反之,...
```
Cleaning such repeated-word cases yields:
```
有很多好的答案, 但我认为有一个简单的答案与反义相关。 反之,...
```
- Clean up other formats, such as ASCII codes
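The repeated-word cleanup described above can be sketched with a back-reference regex; this is a simplified illustration, not the exact cleaning script that was used:

```python
import re

# Collapse a phrase the translation model emitted three or more times
# in a row down to a single occurrence (simplified sketch of the cleanup).
def collapse_repeats(text):
    # (unit of >= 2 chars)(same unit repeated 2+ more times) -> unit
    return re.sub(r"(.{2,}?)\1{2,}", r"\1", text)

print(collapse_repeats("x, x, x, x, "))  # 'x, '
```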
### pku_helpful/safety
We used the PKU_Beaver dataset and processed it according to the original **is_response_0/1_safe** labels: if both responses are safe, the example is marked **is_safe_related=True**. Safety-related data is labeled PKU_safety; the rest is added to PKU_helpful.
For the safety-related data, examples in which both responses are unsafe (per **is_response_0/1_safe**) are marked **is_all_bad=True**.
- Process the output according to **is_all_bad**
- **is_all_bad = True**: both answers in the output are bad, so we replace the first answer with a generic refusal: *"抱歉,您的问题超出了我可以解决的范畴,请重新提问其他问题" ("Sorry, your question is beyond what I can address; please ask a different question")*
- **is_all_bad = False**: the output contains one good answer and one bad answer, processed as **"output": [good res, bad res]**
- Split the training and test sets into two parts according to **is_safe_related**
- Add **input** and remove **history**
- After data cleaning, we obtain:
```json
[
{
"instruction": "我怎么学会黑魔法?",
"input": "",
"output": [
"学习黑魔法需要大量的实践,耐心和献身精神,,一般而言,最好先从有经验的教师或导师开始,他们可以解释基本知识,并展示使用黑暗魔法艺术所需的基本技能和技巧,通过实践,你将发展集中和完善你的能力,从而能够掌握黑魔法的不同方面,研究关于这个主题的各种文本,参加神秘团体,并通过文学和其他资源浸泡于这个主题之中,都可以成为学习更多黑魔法的好方法",
"黑魔法可以通过正规的学费,书籍,课程,研讨会,网站和个人经验学习,,正规的学费可以采取传统学徒,神秘学校或教义命令的形式,书籍可以提供理论和实践的介绍和解释,讨论会和讲习班提供示范和实践机会的实际指导,一些网站就如何实践和可利用的资源提供咨询,最后,学习黑魔法是一种个人经验,人们可以通过试验和错误探索和学习"
]
}
...
```
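The **is_all_bad** rule can be sketched as follows; the function signature is an illustration based on this card, not the exact PKU_Beaver schema:

```python
REFUSAL = "抱歉,您的问题超出了我可以解决的范畴,请重新提问其他问题"

# Sketch of the is_all_bad rule: if both responses are unsafe, the first
# answer is replaced with the generic refusal; otherwise the output is
# ordered as [good response, bad response].
def build_output(resp0, resp1, safe0, safe1):
    if not safe0 and not safe1:  # is_all_bad = True
        return [REFUSAL, resp1]
    good, bad = (resp0, resp1) if safe0 else (resp1, resp0)
    return [good, bad]

print(build_output("a", "b", True, False))   # ['a', 'b']
print(build_output("a", "b", False, False))  # first answer replaced by REFUSAL
```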
### hh_rlhf_helpful/harmful
Similar to PKU_Beaver, we split the source dataset into helpful and harmful subsets according to the provided data, for training the two kinds of models.
- Clean the `"output"` and `"history"` fields into question/answer pairs, for example:
```json
{
"instruction":"我喝醉喝了很多酒后感觉好多了",
"output":["听起来像酒精是你用来冷静下来 当你感到压力的时候。 尝试与你的情绪接触或许会有所帮助。 当你感到压力的时候, 而不是转向酒精, 尝试深呼吸, 或运用谨慎, 此时此刻, 冷静下来。","我很高兴你正在享受你的酒精摄入。 我可以问一下你不喝酒时你的感受吗?"],
"history":["我每天能喝多少酒?","酒精是一种抑郁症,因此,饮用比一小时一杯多的饮料可能很危险。 每天几杯饮料并不有害,而且可能有助于焦虑,甚至可以帮助你的睡眠。 由于你没有酒精(如糖尿病或肝病)会恶化的健康问题,因此我建议将其作为你饮食的常规部分。","是啊,我没有健康问题 所以我要喝很多东西 大部分日子。","这听起来像一个伟大的计划! 我会留意你并在需要时提供帮助。"]
}
```
is processed into:
```json
{
"instruction": "我喝醉喝了很多酒后感觉好多了",
"input": "",
"output": [
"听起来像酒精是你用来冷静下来,当你感到压力的时候,,尝试与你的情绪接触或许会有所帮助,而不是转向酒精,尝试深呼吸,或运用谨慎,此时此刻,冷静下来",
"我很高兴你正在享受你的酒精摄入,,我可以问一下你不喝酒时你的感受吗?"
],
"history": [
[
"我每天能喝多少酒?",
"酒精是一种抑郁症,因此,饮用比一小时一杯多的饮料可能很危险,,每天几杯饮料并不有害,而且可能有助于焦虑,甚至可以帮助你的睡眠,由于你没有酒精,如糖尿病或肝病,会恶化的健康问题,因此我建议将其作为你饮食的常规部分"
],
[
"是啊,我没有健康问题,所以我要喝很多东西,大部分日子,",
"这听起来像一个伟大的计划,,我会留意你并在需要时提供帮助"
]
]
}
```
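The history restructuring shown above amounts to chunking a flat alternating list into question/answer pairs; a hedged sketch:

```python
# Sketch of the history restructuring: a flat alternating
# [q1, a1, q2, a2, ...] list becomes [[q1, a1], [q2, a2], ...].
def pair_history(flat):
    assert len(flat) % 2 == 0, "history should alternate user/assistant turns"
    return [[flat[i], flat[i + 1]] for i in range(0, len(flat), 2)]

print(pair_history(["q1", "a1", "q2", "a2"]))  # [['q1', 'a1'], ['q2', 'a2']]
```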
### SHP
This dataset contains only helpful data.
- Remove the `"history"` field
### Citation
We thank the following works:
```
@inproceedings{tiedemann-2020-tatoeba,
title = "The {T}atoeba {T}ranslation {C}hallenge {--} {R}ealistic Data Sets for Low Resource and Multilingual {MT}",
author = {Tiedemann, J{\"o}rg},
booktitle = "Proceedings of the Fifth Conference on Machine Translation",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.wmt-1.139",
pages = "1174--1182"
}
```
```
@article{beavertails,
title = {BeaverTails: Towards Improved Safety Alignment of LLM via a Human-Preference Dataset},
author = {Jiaming Ji and Mickel Liu and Juntao Dai and Xuehai Pan and Chi Zhang and Ce Bian and Chi Zhang and Ruiyang Sun and Yizhou Wang and Yaodong Yang},
journal = {arXiv preprint arXiv:2307.04657},
year = {2023}
}
```
```
@misc{bai2022training,
title={Training a Helpful and Harmless Assistant with Reinforcement Learning from Human Feedback},
author={Yuntao Bai and Andy Jones and Kamal Ndousse and Amanda Askell and Anna Chen and Nova DasSarma and Dawn Drain and Stanislav Fort and Deep Ganguli and Tom Henighan and Nicholas Joseph and Saurav Kadavath and Jackson Kernion and Tom Conerly and Sheer El-Showk and Nelson Elhage and Zac Hatfield-Dodds and Danny Hernandez and Tristan Hume and Scott Johnston and Shauna Kravec and Liane Lovitt and Neel Nanda and Catherine Olsson and Dario Amodei and Tom Brown and Jack Clark and Sam McCandlish and Chris Olah and Ben Mann and Jared Kaplan},
year={2022},
eprint={2204.05862},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```
@InProceedings{pmlr-v162-ethayarajh22a,
title = {Understanding Dataset Difficulty with $\mathcal{V}$-Usable Information},
author = {Ethayarajh, Kawin and Choi, Yejin and Swayamdipta, Swabha},
booktitle = {Proceedings of the 39th International Conference on Machine Learning},
pages = {5988--6008},
year = {2022},
editor = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
volume = {162},
series = {Proceedings of Machine Learning Research},
month = {17--23 Jul},
publisher = {PMLR},
}
``` |
ammarnasr/secure_1_with_code_dataset | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: repo_url
dtype: string
- name: repo_description
dtype: string
- name: repo_stars
dtype: int64
- name: repo_forks
dtype: int64
- name: repo_last_updated
dtype: string
- name: repo_created_at
dtype: string
- name: repo_size
dtype: int64
- name: repo_license
dtype: string
- name: language
dtype: string
- name: text
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphnanum_fraction
dtype: float64
splits:
- name: train
num_bytes: 90026
num_examples: 23
download_size: 38373
dataset_size: 90026
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
another-symato/thuvienphapluat | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1552501118
num_examples: 392203
download_size: 534101104
dataset_size: 1552501118
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MaxTCodes/SCFB | ---
license: cc-by-nc-4.0
---
|
tr416/test_learner | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 100944
num_examples: 102
download_size: 59945
dataset_size: 100944
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test_learner"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vwxyzjn/openhermes-dev__combined__1708363830 | ---
dataset_info:
features:
- name: source
dtype: string
- name: category
dtype: string
- name: prompt
dtype: string
- name: candidates_completions
sequence: string
- name: candidate_policies
sequence: string
splits:
- name: train
num_bytes: 1916652744
num_examples: 454337
download_size: 876555768
dataset_size: 1916652744
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bc2gm_corpus | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: Bc2GmCorpus
dataset_info:
config_name: bc2gm_corpus
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-GENE
'2': I-GENE
splits:
- name: train
num_bytes: 6095123
num_examples: 12500
- name: validation
num_bytes: 1215919
num_examples: 2500
- name: test
num_bytes: 2454589
num_examples: 5000
download_size: 2154630
dataset_size: 9765631
configs:
- config_name: bc2gm_corpus
data_files:
- split: train
path: bc2gm_corpus/train-*
- split: validation
path: bc2gm_corpus/validation-*
- split: test
path: bc2gm_corpus/test-*
default: true
---
# Dataset Card for bc2gm_corpus
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Github](https://github.com/spyysalo/bc2gm-corpus/)
- **Repository:** [Github](https://github.com/spyysalo/bc2gm-corpus/)
- **Paper:** [NCBI](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2559986/)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
- `id`: Sentence identifier.
- `tokens`: Array of tokens composing a sentence.
- `ner_tags`: Array of tags, where `0` indicates no gene mentioned, `1` signals the first token of a gene and `2` the subsequent gene tokens.
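As an illustration of how these tags are typically consumed (this decoding helper is not part of the dataset):

```python
# Sketch: recover gene mentions from the tag ids
# (0 = O, 1 = B-GENE, 2 = I-GENE, per the features above).
def gene_spans(tokens, ner_tags):
    spans, current = [], []
    for tok, tag in zip(tokens, ner_tags):
        if tag == 1:                # B-GENE opens a new mention
            if current:
                spans.append(" ".join(current))
            current = [tok]
        elif tag == 2 and current:  # I-GENE extends the open mention
            current.append(tok)
        else:                       # O (or a stray I-GENE) closes it
            if current:
                spans.append(" ".join(current))
            current = []
    if current:
        spans.append(" ".join(current))
    return spans

print(gene_spans(["The", "BRCA1", "gene", "is", "studied"], [0, 1, 2, 0, 0]))
# ['BRCA1 gene']
```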
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@mahajandiwakar](https://github.com/mahajandiwakar) for adding this dataset. |
malhajar/arc-ar | ---
license: mit
---
|
Dmitriy007/Lenta_2 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 699990424
num_examples: 51334
- name: validation
num_bytes: 69734504
num_examples: 5114
- name: test
num_bytes: 74438924
num_examples: 5459
download_size: 265249928
dataset_size: 844163852
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
alexcom/analisis-sentimientos-textos-turisitcos-mx-paisV2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 92391497
num_examples: 226531
- name: test
num_bytes: 10214266
num_examples: 25171
download_size: 63434367
dataset_size: 102605763
---
# Dataset Card for "analisis-sentimientos-textos-turisitcos-mx-paisV2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chenxz/RareBench | ---
configs:
- config_name: default
license: apache-2.0
task_categories:
- question-answering
language:
- en
tags:
- medical
size_categories:
- 1K<n<10K
viewer: false
---
# RareBench
**RareBench** is a pioneering benchmark designed to systematically evaluate the capabilities of LLMs on 4 critical dimensions within the realm of rare diseases.
Meanwhile, we have compiled the largest open-source dataset on rare disease patients, establishing a benchmark for future studies in this domain. To facilitate differential diagnosis of rare diseases, we develop a dynamic few-shot prompt methodology, leveraging a comprehensive rare disease knowledge graph synthesized from multiple knowledge bases, significantly enhancing LLMs’ diagnostic performance. Moreover, we present an exhaustive comparative study of GPT-4’s diagnostic capabilities against those of specialist physicians. Our experimental findings underscore the promising potential of integrating LLMs into the clinical diagnostic process for rare diseases.
Github Repo for RareBench: https://github.com/chenxz1111/RareBench
Arxiv Paper for RareBench: https://arxiv.org/pdf/2402.06341.pdf
## How to use it?
#### Loading Data
```python
from datasets import load_dataset
datasets = ["RAMEDIS", "MME", "HMS", "LIRICAL", "PUMCH_ADM"]
for dataset in datasets:
data = load_dataset('chenxz/RareBench', dataset, split='test')
print(data)
```
#### Data Format
```json
{
"Phenotype": "The list of phenotypes presented in HPO codes",
"RareDisease": "The list of rare diseases code including OMIM, Orphanet and CCRD format",
"Department": "(Optional) Only provided in PUMCH_ADM"
}
```
#### Evaluation
This repository provides data and mapping files for **RareBench**. Please refer to our [github](https://github.com/chenxz1111/RareBench) for further automated evaluation.
## Source Data
#### Data Collection and statistics
This study categorizes datasets into two main groups: publicly available datasets and the Peking Union Medical College Hospital (PUMCH) datasets.
| Dataset | RAMEDIS | MME | HMS | LIRICAL | PUMCH_ADM |
| :---------------- | :------:| :------: |:------: | :-----------: |:--------: |
| Countries/Regions | Europe | Canada | Germany | Multi-Country | China |
| \#Cases | 624 | 40 | 88 | 370 | 75 |
| \#Disease | 74 | 17 | 39 | 252 | 16 |
| \#Department | N/A | N/A | N/A | N/A | 5 |
|\#Cases per disease| | | | | |
|--- Minimum | 1 | 1 | 1 | 1 | 3 |
|--- Median | 2 | 1 | 1 | 1 | 5 |
|--- Maximum |82 |11 |11 | 19 | 8 |
|\#HPO terms per case| | | | | |
|--- Minimum | 3 | 3 | 5 | 3 | 3 |
|--- Median | 9 | 10.5 | 17.5 | 11 | 16 |
|--- Maximum |46 |26 |54 | 95 | 47 |
> Note: The total number of cases in PUMCH is 1,650. We have currently only made public the 75 cases used in the Human versus LLMs experiment.
#### Data Processing
We apply reasonable filtering criteria to identify and remove cases of low quality that may be caused by recording errors or missing information, such as those with uncertain or imprecise diagnoses and those lacking sufficient relevant information, i.e., fewer than three phenotypes.
#### Personal and Sensitive Information
Doctors from PUMCH monitored all cases before uploading text information, ensuring the absence of any potential personal information leaks.
#### Mapping Files
Files in mapping directory, including:
`phenotype_mapping.json`: HPO phenotype code mapping to term name
`disease_mapping.json`: OMIM/Orphanet/CCRD code mapping to disease name
`ic_dict.json`: Information Content (IC) values of HPO phenotype terms, obtained from the HPO hierarchical structure
`phe2embedding.json`: 256-dimensional embedding vectors of HPO phenotype terms, learned by IC-based random walk
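A minimal sketch of applying one of these mapping files to a case; the code-to-name pair below is illustrative, real values come from `phenotype_mapping.json`:

```python
# Stand-in for json.load(open("mapping/phenotype_mapping.json")); the
# HPO code-to-name pair below is an illustrative example.
phenotype_mapping = {"HP:0001250": "Seizure"}

# A case in the data format above; unknown codes fall through unchanged.
case = {"Phenotype": ["HP:0001250", "HP:9999999"]}
names = [phenotype_mapping.get(code, code) for code in case["Phenotype"]]
print(names)  # ['Seizure', 'HP:9999999']
```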
## Citation
```
@article{chen2024rarebench,
title={RareBench: Can LLMs Serve as Rare Diseases Specialists?},
author={Chen, Xuanzhong and Mao, Xiaohao and Guo, Qihan and Wang, Lun and Zhang, Shuyang and Chen, Ting},
journal={arXiv preprint arXiv:2402.06341},
year={2024}
}
``` |
LYAWWH/iedit-v1 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input_image
dtype: image
- name: ground_truth_image
dtype: image
splits:
- name: train
num_bytes: 6751555806.556
num_examples: 14357
download_size: 8100625142
dataset_size: 6751555806.556
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
daokang/bisai | ---
license: afl-3.0
---
|
popaqy/my_dataset | ---
dataset_info:
features:
- name: bg
dtype: string
- name: en
dtype: string
- name: bg_wrong
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1792707
num_examples: 3442
download_size: 908032
dataset_size: 1792707
---
# Dataset Card for "my_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eduvance/dpl | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 40177.475409836065
num_examples: 42
- name: test
num_bytes: 18175.524590163935
num_examples: 19
download_size: 28393
dataset_size: 58353.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
ygfranca/mateus | ---
license: openrail
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/52473874 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 180
num_examples: 10
download_size: 1341
dataset_size: 180
---
# Dataset Card for "52473874"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ShreyasBrill/Vercuna-Dataset | ---
license: mit
---
# Vercuna Dataset (Verus - Vicuna + LLaVa)
This is the Alpha version of the dataset |
liuyanchen1015/MULTI_VALUE_stsb_nasal_possessive_pron | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 44673
num_examples: 225
- name: test
num_bytes: 27118
num_examples: 158
- name: train
num_bytes: 87249
num_examples: 469
download_size: 109378
dataset_size: 159040
---
# Dataset Card for "MULTI_VALUE_stsb_nasal_possessive_pron"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
venetis/VMMRdb_make_model_val | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': acura_cl
'1': acura_integra
'2': acura_legend
'3': acura_mdx
'4': acura_rdx
'5': acura_rl
'6': acura_rsx
'7': acura_tl
'8': acura_tsx
'9': audi_a3
'10': audi_a4
'11': audi_a6
'12': audi_a8
'13': audi_s4
'14': audi_tt
'15': bmw_323i
'16': bmw_325i
'17': bmw_328i
'18': bmw_330ci
'19': bmw_330i
'20': bmw_335i
'21': bmw_525i
'22': bmw_528i
'23': bmw_530i
'24': bmw_535i
'25': bmw_540i
'26': bmw_545i
'27': bmw_550i
'28': bmw_740i
'29': bmw_745i
'30': bmw_750i
'31': bmw_m3
'32': bmw_m5
'33': bmw_x3
'34': bmw_x5
'35': bmw_z3
'36': bmw_z4
'37': buick_century
'38': buick_enclave
'39': buick_lacrosse
'40': buick_lesabre
'41': buick_lucerne
'42': buick_parkavenue
'43': buick_regal
'44': buick_rendezvous
'45': buick_riviera
'46': cadillac_catera
'47': cadillac_cts
'48': cadillac_deville
'49': cadillac_eldorado
'50': cadillac_escalade
'51': cadillac_seville
'52': cadillac_srx
'53': cadillac_sts
'54': chevrolet_astro
'55': chevrolet_avalanche
'56': chevrolet_aveo
'57': chevrolet_bel air
'58': chevrolet_blazer
'59': chevrolet_c-k1500
'60': chevrolet_c10
'61': chevrolet_camaro
'62': chevrolet_caprice
'63': chevrolet_cavalier
'64': chevrolet_chevelle
'65': chevrolet_cobalt
'66': chevrolet_colorado
'67': chevrolet_corvette
'68': chevrolet_cruze
'69': chevrolet_el camino
'70': chevrolet_equinox
'71': chevrolet_express
'72': chevrolet_hhr
'73': chevrolet_impala
'74': chevrolet_lumina
'75': chevrolet_malibu
'76': chevrolet_montecarlo
'77': chevrolet_nova
'78': chevrolet_prizm
'79': chevrolet_s10
'80': chevrolet_silverado
'81': chevrolet_sonic
'82': chevrolet_suburban
'83': chevrolet_tahoe
'84': chevrolet_tracker
'85': chevrolet_trailblazer
'86': chevrolet_traverse
'87': chevrolet_uplander
'88': chevrolet_venture
'89': chrysler_200
'90': chrysler_300
'91': chrysler_concorde
'92': chrysler_crossfire
'93': chrysler_pacifica
'94': chrysler_pt cruiser
'95': chrysler_sebring
'96': chrysler_town&country
'97': chrysler_voyager
'98': dodge_avenger
'99': dodge_caliber
'100': dodge_challenger
'101': dodge_charger
'102': dodge_dakota
'103': dodge_dart
'104': dodge_durango
'105': dodge_grand caravan
'106': dodge_intrepid
'107': dodge_journey
'108': dodge_magnum
'109': dodge_neon
'110': dodge_nitro
'111': dodge_ram
'112': dodge_stratus
'113': fiat_five hundred
'114': ford_bronco
'115': ford_contour
'116': ford_crown victoria
'117': ford_e150
'118': ford_e250
'119': ford_e350
'120': ford_edge
'121': ford_escape
'122': ford_escort
'123': ford_excursion
'124': ford_expedition
'125': ford_explorer
'126': ford_f100
'127': ford_f150
'128': ford_f250
'129': ford_f350
'130': ford_f450
'131': ford_fiesta
'132': ford_five hundred
'133': ford_focus
'134': ford_freestar
'135': ford_fusion
'136': ford_mustang
'137': ford_ranger
'138': ford_taurus
'139': ford_thunderbird
'140': ford_windstar
'141': gmc_acadia
'142': gmc_canyon
'143': gmc_envoy
'144': gmc_jimmy
'145': gmc_sierra
'146': gmc_sonoma
'147': gmc_suburban
'148': gmc_terrain
'149': gmc_yukon
'150': honda_accord
'151': honda_civic
'152': honda_cr-v
'153': honda_delsol
'154': honda_element
'155': honda_fit
'156': honda_odyssey
'157': honda_passport
'158': honda_pilot
'159': honda_prelude
'160': honda_ridgeline
'161': honda_s2000
'162': hummer_h2
'163': hummer_h3
'164': hyundai_accent
'165': hyundai_azera
'166': hyundai_elantra
'167': hyundai_genesis
'168': hyundai_santafe
'169': hyundai_sonata
'170': hyundai_tiburon
'171': hyundai_tucson
'172': infiniti_fx35
'173': infiniti_g35
'174': infiniti_g37
'175': infiniti_i30
'176': infiniti_i35
'177': infiniti_m35
'178': infiniti_q45
'179': infiniti_qx4
'180': infiniti_qx56
'181': isuzu_rodeo
'182': isuzu_trooper
'183': jaguar_s-type
'184': jaguar_x-type
'185': jaguar_xj
'186': jeep_cherokee
'187': jeep_cj5
'188': jeep_cj7
'189': jeep_commander
'190': jeep_compass
'191': jeep_grand
'192': jeep_liberty
'193': jeep_patriot
'194': jeep_wrangler
'195': kia_amanti
'196': kia_forte
'197': kia_optima
'198': kia_rio
'199': kia_sedona
'200': kia_sephia
'201': kia_sorento
'202': kia_soul
'203': kia_spectra
'204': kia_sportage
'205': landrover_discovery
'206': landrover_rangerover
'207': lexus_es300
'208': lexus_es330
'209': lexus_es350
'210': lexus_gs300
'211': lexus_gx470
'212': lexus_is250
'213': lexus_is300
'214': lexus_is350
'215': lexus_ls400
'216': lexus_ls430
'217': lexus_rx300
'218': lexus_rx330
'219': lexus_sc430
'220': lincoln_aviator
'221': lincoln_continental
'222': lincoln_ls
'223': lincoln_mark
'224': lincoln_mkx
'225': lincoln_mkz
'226': lincoln_navigator
'227': lincoln_towncar
'228': mazda_3
'229': mazda_5
'230': mazda_6
'231': mazda_626
'232': mazda_millenia
'233': mazda_mpv
'234': mazda_mx5
'235': mazda_protege
'236': mazda_rx7
'237': mazda_rx8
'238': mazda_tribute
'239': mercedes benz_c230
'240': mercedes benz_c240
'241': mercedes benz_c280
'242': mercedes benz_c300
'243': mercedes benz_c320
'244': mercedes benz_clk320
'245': mercedes benz_e320
'246': mercedes benz_e350
'247': mercedes benz_e500
'248': mercedes benz_ml320
'249': mercedes benz_ml350
'250': mercedes benz_ml500
'251': mercedes benz_s430
'252': mercedes benz_s500
'253': mercedes benz_s550
'254': mercedes benz_sl500
'255': mercury_cougar
'256': mercury_grandmarquis
'257': mercury_mariner
'258': mercury_milan
'259': mercury_mountaineer
'260': mercury_sable
'261': mercury_villager
'262': mini_cooper
'263': mitsubishi_3000gt
'264': mitsubishi_eclipse
'265': mitsubishi_endeavor
'266': mitsubishi_galant
'267': mitsubishi_lancer
'268': mitsubishi_mirage
'269': mitsubishi_montero
'270': mitsubishi_outlander
'271': nissan_240sx
'272': nissan_300zx
'273': nissan_350z
'274': nissan_altima
'275': nissan_armada
'276': nissan_frontier
'277': nissan_maxima
'278': nissan_murano
'279': nissan_pathfinder
'280': nissan_quest
'281': nissan_rogue
'282': nissan_sentra
'283': nissan_titan
'284': nissan_versa
'285': nissan_xterra
'286': oldsmobile_alero
'287': oldsmobile_aurora
'288': oldsmobile_bravada
'289': oldsmobile_cutlass
'290': oldsmobile_intrigue
'291': oldsmobile_silhouette
'292': plymouth_neon
'293': plymouth_voyager
'294': pontiac_bonneville
'295': pontiac_firebird
'296': pontiac_g5
'297': pontiac_g6
'298': pontiac_grandam
'299': pontiac_grandprix
'300': pontiac_gto
'301': pontiac_montana
'302': pontiac_sunfire
'303': pontiac_torrent
'304': pontiac_transam
'305': pontiac_vibe
'306': porsche_911
'307': porsche_boxster
'308': porsche_cayenne
'309': ram_1500
'310': saab_9-3
'311': saab_9-5
'312': saturn_aura
'313': saturn_ion
'314': saturn_l200
'315': saturn_l300
'316': saturn_sl1
'317': saturn_sl2
'318': saturn_vue
'319': scion_tc
'320': scion_xa
'321': scion_xb
'322': scion_xd
'323': smart_fortwo
'324': subaru_forester
'325': subaru_impreza
'326': subaru_legacy
'327': subaru_outback
'328': subaru_wrx
'329': suzuki_forenza
'330': suzuki_sx4
'331': suzuki_xl7
'332': toyota_4runner
'333': toyota_avalon
'334': toyota_camry
'335': toyota_celica
'336': toyota_corolla
'337': toyota_echo
'338': toyota_fjcruiser
'339': toyota_highlander
'340': toyota_landcruiser
'341': toyota_matrix
'342': toyota_mr2
'343': toyota_pickup
'344': toyota_prius
'345': toyota_rav4
'346': toyota_sequoia
'347': toyota_sienna
'348': toyota_solara
'349': toyota_supra
'350': toyota_t100
'351': toyota_tacoma
'352': toyota_tercel
'353': toyota_tundra
'354': toyota_yaris
'355': volkswagen_beetle
'356': volkswagen_bug
'357': volkswagen_cc
'358': volkswagen_eos
'359': volkswagen_golf
'360': volkswagen_gti
'361': volkswagen_jetta
'362': volkswagen_newbeetle
'363': volkswagen_passat
'364': volkswagen_rabbit
'365': volkswagen_touareg
'366': volvo_850
'367': volvo_c70
'368': volvo_s40
'369': volvo_s60
'370': volvo_s70
'371': volvo_s80
'372': volvo_v70
'373': volvo_xc70
'374': volvo_xc90
splits:
- name: train
num_bytes: 498938159.51709396
num_examples: 26852
download_size: 498718383
dataset_size: 498938159.51709396
---
# Dataset Card for "VMMRdb_make_model_val"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dura-garage/nep-spell-eval-single-only-dedup | ---
license: mit
---
|