id stringlengths 2 115 | author stringlengths 2 42 ⌀ | last_modified timestamp[us, tz=UTC] | downloads int64 0 8.87M | likes int64 0 3.84k | paperswithcode_id stringlengths 2 45 ⌀ | tags list | lastModified timestamp[us, tz=UTC] | createdAt stringlengths 24 24 | key stringclasses 1 value | created timestamp[us] | card stringlengths 1 1.01M | embedding list | library_name stringclasses 21 values | pipeline_tag stringclasses 27 values | mask_token null | card_data null | widget_data null | model_index null | config null | transformers_info null | spaces null | safetensors null | transformersInfo null | modelId stringlengths 5 111 ⌀ | embeddings list |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
dadib/sample_pdf | dadib | 2023-11-24T13:10:16Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T13:10:16Z | 2023-11-24T12:52:33.000Z | 2023-11-24T12:52:33 | Entry not found | [
-0.32276472449302673,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965679168701,
0.7915717363357544,
0.07618629932403564,
0.7746022939682007,
0.2563222646713257,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
mandania/i-am-that | mandania | 2023-11-24T12:55:29Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T12:55:29Z | 2023-11-24T12:55:22.000Z | 2023-11-24T12:55:22 | ---
dataset_info:
features:
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 895873
num_examples: 88
- name: test
num_bytes: 143773
num_examples: 16
download_size: 582588
dataset_size: 1039646
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
mandania/i-am-that-split | mandania | 2023-11-24T12:57:18Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T12:57:18Z | 2023-11-24T12:57:11.000Z | 2023-11-24T12:57:11 | ---
dataset_info:
features:
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 708164
num_examples: 1698
- name: test
num_bytes: 125231
num_examples: 300
download_size: 485640
dataset_size: 833395
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
tollefj/NORTS | tollefj | 2023-11-24T15:55:48Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T15:55:48Z | 2023-11-24T13:16:41.000Z | 2023-11-24T13:16:41 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: AssignmentId
dtype: string
- name: docId
dtype: string
- name: article
dtype: string
- name: tid1
dtype: int64
- name: tid2
dtype: int64
- name: words1
dtype: string
- name: words2
dtype: string
- name: phrases1
dtype: string
- name: phrases2
dtype: string
- name: sentences1
dtype: string
- name: sentences2
dtype: string
- name: summary1
dtype: string
- name: summary2
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 11384802
num_examples: 2400
- name: test
num_bytes: 2979312
num_examples: 600
download_size: 7539242
dataset_size: 14364114
---
# NORTS - Norwegian Topic-based Summarization Dataset
Translated from NEWTS (the NEWs Topic-based Summarization dataset, https://github.com/ali-bahrainian/NEWTS) using the NLLB 1.3B model (https://huggingface.co/facebook/nllb-200-distilled-1.3B)
| [
-0.31715840101242065,
-0.6653602123260498,
0.19634822010993958,
0.15342789888381958,
-0.6731371879577637,
-0.2622242569923401,
0.14489099383354187,
-0.038612354546785355,
0.7231603264808655,
0.9585690498352051,
-0.7627453804016113,
-0.9278209805488586,
-0.42371994256973267,
0.0470813252031... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Alex-Song/Test | Alex-Song | 2023-11-25T09:38:35Z | 0 | 0 | null | [
"task_categories:translation",
"size_categories:1K<n<10K",
"language:ja",
"language:zh",
"language:ar",
"license:apache-2.0",
"music",
"region:us"
] | 2023-11-25T09:38:35Z | 2023-11-24T13:17:27.000Z | 2023-11-24T13:17:27 | ---
license: apache-2.0
task_categories:
- translation
language:
- ja
- zh
- ar
tags:
- music
pretty_name: MTSpeech
size_categories:
- 1K<n<10K
extra_gated_prompt: "You agree to not attempt to determine the identity of individuals in this dataset"
extra_gated_fields:
Name: text
Affiliation: text
Email: text
I agree to not attempt to determine the identity of speakers in this dataset: checkbox
viewer: false
---
# This is a really awesome dataset
| [
-0.26123639941215515,
-0.8457940220832825,
0.08052433282136917,
0.9108937382698059,
-0.9176377058029175,
0.03428245335817337,
0.3532896041870117,
0.022798899561166763,
0.8705301284790039,
0.9324696660041809,
-0.29722580313682556,
-0.5065327882766724,
-0.8883576989173889,
0.3973142206668854... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
vibhorag101/suicide_prediction_dataset_phr | vibhorag101 | 2023-11-25T03:52:20Z | 0 | 0 | null | [
"task_categories:text-classification",
"size_categories:100K<n<1M",
"language:en",
"license:mit",
"region:us"
] | 2023-11-25T03:52:20Z | 2023-11-24T13:27:36.000Z | 2023-11-24T13:27:36 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 75975910.63587219
num_examples: 185574
- name: test
num_bytes: 18994182.36412781
num_examples: 46394
download_size: 53587175
dataset_size: 94970093
license: mit
task_categories:
- text-classification
language:
- en
pretty_name: Suicidal Tendency Prediction Dataset
size_categories:
- 100K<n<1M
---
# Dataset Card for "vibhorag101/suicide_prediction_dataset_phr"
- The dataset is sourced from Reddit and is available on [Kaggle](https://www.kaggle.com/datasets/nikhileswarkomati/suicide-watch).
- The dataset contains text with binary labels for suicide or non-suicide.
- The dataset was cleaned by applying the following steps:
    - Converted the text to lowercase.
    - Removed numbers and special characters.
    - Removed URLs, emojis, and accented characters.
    - Removed word contractions.
    - Removed extra whitespace.
    - Removed any character repeated more than 3 consecutive times.
    - Tokenised the text, lemmatized it, and removed stopwords (excluding "not").
- The `class_label` column was renamed to `label` for use with the Trainer API.
- The evaluation set had ~23,000 samples and the training set ~186k samples, i.e. an 80:10:10 (train:test:val) split. | [
-0.28589773178100586,
-0.5444280505180359,
0.1536824107170105,
0.23606759309768677,
-0.3563520014286041,
0.14987239241600037,
-0.1589203178882599,
-0.17341378331184387,
0.16540087759494781,
0.22506016492843628,
-0.9540337324142456,
-0.6982116103172302,
-0.6988639831542969,
0.22481791675090... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
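The cleaning steps listed in the card above can be sketched with the standard library alone. This is a minimal illustration, not the original pipeline: the tokenization, lemmatization, and stopword-removal step (done with an NLP library in the actual preprocessing) is deliberately omitted, and the function and pattern names are ours.

```python
import re

def clean_text(text: str) -> str:
    """Regex-based sketch of the cleaning steps described in the card.

    Lemmatization and stopword removal (which the original pipeline does
    with an NLP toolkit) are intentionally left out of this sketch.
    """
    text = text.lower()                                 # lowercase
    text = re.sub(r"https?://\S+|www\.\S+", " ", text)  # remove URLs
    text = text.encode("ascii", "ignore").decode()      # drop emojis/accented chars
    text = re.sub(r"[^a-z\s]", " ", text)               # numbers/special characters
    text = re.sub(r"(.)\1{3,}", r"\1\1\1", text)        # cap repeats at 3 chars
    text = re.sub(r"\s+", " ", text).strip()            # collapse extra whitespace
    return text
```

For example, `clean_text("Check https://example.com NOW!!!! 123 soooooo sad")` yields `"check now sooo sad"`.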
Coroseven/AliceZuberg | Coroseven | 2023-11-24T13:33:20Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T13:33:20Z | 2023-11-24T13:28:53.000Z | 2023-11-24T13:28:53 | Entry not found | [
-0.32276487350463867,
-0.22568444907665253,
0.8622263073921204,
0.43461570143699646,
-0.5282988548278809,
0.7012969255447388,
0.7915717363357544,
0.07618642598390579,
0.7746027112007141,
0.25632190704345703,
-0.7852815389633179,
-0.22573848068714142,
-0.910447895526886,
0.5715675354003906,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Zerenidel/Karakter_Anime_SMA | Zerenidel | 2023-11-24T15:44:58Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T15:44:58Z | 2023-11-24T14:05:01.000Z | 2023-11-24T14:05:01 | Entry not found | [
-0.32276487350463867,
-0.22568444907665253,
0.8622263073921204,
0.43461570143699646,
-0.5282988548278809,
0.7012969255447388,
0.7915717363357544,
0.07618642598390579,
0.7746027112007141,
0.25632190704345703,
-0.7852815389633179,
-0.22573848068714142,
-0.910447895526886,
0.5715675354003906,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
hbilgen/sap-notes | hbilgen | 2023-11-24T14:21:22Z | 0 | 0 | null | [
"license:unknown",
"region:us"
] | 2023-11-24T14:21:22Z | 2023-11-24T14:21:22.000Z | 2023-11-24T14:21:22 | ---
license: unknown
---
| [
-0.12853369116783142,
-0.18616779148578644,
0.6529126167297363,
0.49436280131340027,
-0.193193256855011,
0.2360745668411255,
0.36071979999542236,
0.05056314915418625,
0.5793651342391968,
0.740013837814331,
-0.6508103013038635,
-0.23783960938453674,
-0.7102248668670654,
-0.04782580211758613... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
hbilgen/sap-help | hbilgen | 2023-11-24T14:21:35Z | 0 | 0 | null | [
"license:unknown",
"region:us"
] | 2023-11-24T14:21:35Z | 2023-11-24T14:21:35.000Z | 2023-11-24T14:21:35 | ---
license: unknown
---
| [
-0.12853369116783142,
-0.18616779148578644,
0.6529126167297363,
0.49436280131340027,
-0.193193256855011,
0.2360745668411255,
0.36071979999542236,
0.05056314915418625,
0.5793651342391968,
0.740013837814331,
-0.6508103013038635,
-0.23783960938453674,
-0.7102248668670654,
-0.04782580211758613... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
hbilgen/sap-basis | hbilgen | 2023-11-24T14:21:47Z | 0 | 0 | null | [
"license:unknown",
"region:us"
] | 2023-11-24T14:21:47Z | 2023-11-24T14:21:47.000Z | 2023-11-24T14:21:47 | ---
license: unknown
---
| [
-0.12853369116783142,
-0.18616779148578644,
0.6529126167297363,
0.49436280131340027,
-0.193193256855011,
0.2360745668411255,
0.36071979999542236,
0.05056314915418625,
0.5793651342391968,
0.740013837814331,
-0.6508103013038635,
-0.23783960938453674,
-0.7102248668670654,
-0.04782580211758613... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
hbilgen/sap-community | hbilgen | 2023-11-24T14:22:17Z | 0 | 0 | null | [
"license:unknown",
"region:us"
] | 2023-11-24T14:22:17Z | 2023-11-24T14:22:15.000Z | 2023-11-24T14:22:15 | ---
license: unknown
---
| [
-0.12853369116783142,
-0.18616779148578644,
0.6529126167297363,
0.49436280131340027,
-0.193193256855011,
0.2360745668411255,
0.36071979999542236,
0.05056314915418625,
0.5793651342391968,
0.740013837814331,
-0.6508103013038635,
-0.23783960938453674,
-0.7102248668670654,
-0.04782580211758613... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Lununlu/Applio-RVC-Fork | Lununlu | 2023-11-24T14:31:54Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-24T14:31:54Z | 2023-11-24T14:27:43.000Z | 2023-11-24T14:27:43 | ---
license: apache-2.0
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Devdeshitha/new-fine-tuning-mistral7b | Devdeshitha | 2023-11-24T14:32:27Z | 0 | 1 | null | [
"region:us"
] | 2023-11-24T14:32:27Z | 2023-11-24T14:31:01.000Z | 2023-11-24T14:31:01 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
CCCarloooo/eval-llava | CCCarloooo | 2023-11-24T14:38:51Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T14:38:51Z | 2023-11-24T14:36:17.000Z | 2023-11-24T14:36:17 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
junglepy/imgs_metro | junglepy | 2023-11-24T14:36:34Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T14:36:34Z | 2023-11-24T14:36:34.000Z | 2023-11-24T14:36:34 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
manu/bnf_clean | manu | 2023-11-24T15:16:04Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T15:16:04Z | 2023-11-24T14:38:15.000Z | 2023-11-24T14:38:15 | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: author
dtype: string
- name: title
dtype: string
- name: mean_nqa
dtype: float64
- name: date
dtype: string
- name: subject
dtype: string
- name: rights
dtype: string
- name: original_folder
dtype: string
- name: perplexity
dtype: float64
splits:
- name: '2023'
num_bytes: 129088433.72207084
num_examples: 441
- name: '2021_1'
num_bytes: 96451.66666666667
num_examples: 5
- name: '2021_2'
num_bytes: 85416.8
num_examples: 4
download_size: 77863123
dataset_size: 129270302.18873751
---
# Dataset Card for "bnf_clean"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.6670714616775513,
-0.34226056933403015,
-0.03091275505721569,
0.08318306505680084,
-0.4013773202896118,
-0.10246186703443527,
0.3434034287929535,
-0.2529584765434265,
0.7355036735534668,
0.7711001634597778,
-0.812732458114624,
-0.8532160520553589,
-0.3682066798210144,
-0.025307565927505... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Youssef115/Lincon | Youssef115 | 2023-11-24T14:49:47Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T14:49:47Z | 2023-11-24T14:49:47.000Z | 2023-11-24T14:49:47 | Entry not found | [
-0.3227647542953491,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965083122253,
0.7915717959403992,
0.07618629932403564,
0.7746022343635559,
0.2563222348690033,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
vinny9812/audiobrunogiglio | vinny9812 | 2023-11-24T19:36:11Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T19:36:11Z | 2023-11-24T14:52:16.000Z | 2023-11-24T14:52:16 | ---
license: openrail
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
WuJianping/test-dataset-abc | WuJianping | 2023-11-24T15:16:18Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-24T15:16:18Z | 2023-11-24T15:16:17.000Z | 2023-11-24T15:16:17 | ---
license: apache-2.0
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AppleHarem/laffey_azurlane | AppleHarem | 2023-11-24T15:24:37Z | 0 | 0 | null | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2023-11-24T15:24:37Z | 2023-11-24T15:24:24.000Z | 2023-11-24T15:24:24 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of laffey (Azur Lane)
This is the dataset of laffey (Azur Lane), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)) via [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 516 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 581 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 516 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 516 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 227 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 581 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 581 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
| [
-0.5823741555213928,
-0.3986625373363495,
0.6063550710678101,
0.09394542127847672,
-0.08231361955404282,
-0.14676927030086517,
0.27251267433166504,
-0.6061050891876221,
0.7602015137672424,
0.7946793437004089,
-0.8630067110061646,
-0.8019342422485352,
-0.4888274073600769,
0.2394659072160720... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
lucasabc/vozes | lucasabc | 2023-11-24T15:25:39Z | 0 | 0 | null | [
"license:other",
"region:us"
] | 2023-11-24T15:25:39Z | 2023-11-24T15:24:53.000Z | 2023-11-24T15:24:53 | ---
license: other
license_name: teste
license_link: LICENSE
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
philipphager/baidu-ultr_baidu-base-12L | philipphager | 2023-11-28T13:23:56Z | 0 | 0 | null | [
"license:cc-by-nc-4.0",
"region:us"
] | 2023-11-28T13:23:56Z | 2023-11-24T15:37:16.000Z | 2023-11-24T15:37:16 | ---
license: cc-by-nc-4.0
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
philipphager/baidu-ultr_tencent-bert-12L | philipphager | 2023-11-28T13:23:15Z | 0 | 0 | null | [
"license:cc-by-nc-4.0",
"region:us"
] | 2023-11-28T13:23:15Z | 2023-11-24T15:37:43.000Z | 2023-11-24T15:37:43 | ---
license: cc-by-nc-4.0
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Baldezo313/rsna-pneumonia-dataset | Baldezo313 | 2023-11-25T16:52:14Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T16:52:14Z | 2023-11-24T15:40:03.000Z | 2023-11-24T15:40:03 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
cyanelis/4349 | cyanelis | 2023-11-25T22:30:19Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T22:30:19Z | 2023-11-24T15:40:35.000Z | 2023-11-24T15:40:35 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
projecte-aina/CA-PT_Parallel_Corpus | projecte-aina | 2023-11-27T14:57:37Z | 0 | 0 | null | [
"task_categories:translation",
"multilinguality:translation",
"size_categories:1M<n<10M",
"source_datasets:original",
"language:ca",
"language:pt",
"language:multilingual",
"region:us"
] | 2023-11-27T14:57:37Z | 2023-11-24T15:40:36.000Z | 2023-11-24T15:40:36 | ---
language:
- ca
- pt
- multilingual
multilinguality:
- translation
pretty_name: CA-PT Parallel Corpus
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- translation
task_ids: []
---
# Dataset Card for CA-PT Parallel Corpus
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Source Data](#source-data)
- [Data preparation](#data-preparation)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Author](#author)
- [Contact Information](#contact-information)
- [Copyright](#copyright)
- [Licensing information](#licensing-information)
- [Funding](#funding)
## Dataset Description
### Dataset Summary
The CA-PT Parallel Corpus is a Catalan-Portuguese dataset of **9.892.953** parallel sentences. The dataset was created to support Catalan NLP tasks, e.g.,
Machine Translation.
### Supported Tasks and Leaderboards
The dataset can be used to train a model for Multilingual Machine Translation. Success on this task is typically measured by achieving a high BLEU score.
### Languages
The texts in the dataset are in Catalan and Portuguese.
## Dataset Structure
Two separate text files are provided, with the sentences sorted in the same order:
- ca-pt_2023_09_01_full.ca: contains 9.892.953 Catalan sentences.
- ca-pt_2023_09_01_full.pt: contains 9.892.953 Portuguese sentences.
### Data Splits
The dataset contains a single split: `train`.
## Dataset Creation
### Source Data
The dataset is a combination of the following authentic datasets:
| Dataset | Sentences |
|:-------|-------:|
| CCMatrix v1 | 3.765.459 |
| WikiMatrix | 317.649 |
| GNOME | 1.752 |
| KDE4 | 117.828 |
| QED | 43.736 |
| TED2020 v1 | 41.461 |
| OpenSubtitles | 235.604 |
| GlobalVoices | 3.430 |
| Tatoeba | 723 |
| Europarl | 3.765.459 |
| **Total** | **6.159.631** |
All corpora except Europarl were collected from [Opus](https://opus.nlpl.eu/).
The Europarl corpus is a synthetic parallel corpus created from the original Spanish-Catalan corpus by [SoftCatalà](https://github.com/Softcatala/Europarl-catalan).
The remaining **3.733.322** sentences are synthetic parallel data created from a random sampling of the Spanish-Portuguese corpora available on [Opus](https://opus.nlpl.eu/) and translated into Catalan using the [PlanTL es-ca](https://huggingface.co/PlanTL-GOB-ES/mt-plantl-es-ca) model.
### Data preparation
All datasets are deduplicated and filtered to remove any sentence pairs with a cosine similarity of less than 0.75.
This is done using sentence embeddings calculated using [LaBSE](https://huggingface.co/sentence-transformers/LaBSE).
The filtered datasets are then concatenated to form a final corpus of **9.892.953** parallel sentences. Before training, the punctuation is normalized using a modified version of the join-single-file.py script from [SoftCatalà](https://github.com/Softcatala/nmt-models/blob/master/data-processing-tools/join-single-file.py).
### Personal and Sensitive Information
No anonymisation process was performed.
## Considerations for Using the Data
### Social Impact of Dataset
The purpose of this dataset is to help develop Machine Translation for low-resource languages such as Catalan.
### Discussion of Biases
We are aware that since part of the data comes from unreliable web pages and non-curated texts, some biases may be present in the dataset.
Nonetheless, we have not applied any steps to reduce their impact.
### Other Known Limitations
The dataset contains general-domain data, so its application to more specific domains such as biomedical or legal text would be of limited use.
## Additional Information
### Author
Language Technologies Unit (LangTech) at the Barcelona Supercomputing Center.
### Contact information
For further information, please send an email to langtech@bsc.es.
### Copyright
Copyright Language Technologies Unit at Barcelona Supercomputing Center (2023).
### Licensing information
This work is licensed under an [Attribution-NonCommercial-ShareAlike 4.0 International](https://creativecommons.org/licenses/by-nc-sa/4.0/) license.
### Funding
This work was funded by the [Departament de la Vicepresidència i de Polítiques Digitals i Territori de la Generalitat de Catalunya](https://politiquesdigitals.gencat.cat/ca/inici/index.html#googtrans(ca|en)) within the framework of [Projecte AINA](https://politiquesdigitals.gencat.cat/ca/economia/catalonia-ai/aina). | [
-0.3180238604545593,
-0.6326935291290283,
0.2574456036090851,
0.6669911742210388,
-0.26021867990493774,
0.07734744995832443,
-0.5160090923309326,
-0.2699540853500366,
0.610632598400116,
0.5141462087631226,
-0.42508357763290405,
-0.9269490242004395,
-0.7438575625419617,
0.43041089177131653,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
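The deduplication and cosine-similarity filtering described in the data-preparation section above can be sketched as follows. This assumes sentence embeddings have already been computed (with LaBSE via a library such as sentence-transformers, not shown here); the function names and toy embeddings are illustrative, not from the original tooling.

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = sqrt(sum(a * a for a in u))
    norm_v = sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def filter_pairs(pairs, src_embeddings, tgt_embeddings, threshold=0.75):
    """Deduplicate sentence pairs, then keep only those whose source/target
    embedding cosine similarity is at least `threshold` (0.75 in the card)."""
    seen = set()
    kept = []
    for i, pair in enumerate(pairs):
        if pair in seen:
            continue
        seen.add(pair)
        if cosine(src_embeddings[i], tgt_embeddings[i]) >= threshold:
            kept.append(pair)
    return kept
```

In the real pipeline the embeddings come from LaBSE, so aligned translations score high and misaligned pairs are discarded.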
osyvokon/ua_gec_instruction_tuning | osyvokon | 2023-11-24T16:50:06Z | 0 | 0 | null | [
"size_categories:10K<n<100K",
"language:uk",
"license:cc-by-4.0",
"region:us"
] | 2023-11-24T16:50:06Z | 2023-11-24T16:14:18.000Z | 2023-11-24T16:14:18 | ---
license: cc-by-4.0
language:
- uk
size_categories:
- 10K<n<100K
---
# UA-GEC instruction tuning
This dataset contains prompts and expected outputs for the grammatical error
correction task in the Ukrainian language. It is based on the
[UA-GEC](https://github.com/grammarly/ua-gec) dataset; the original data is
licensed under CC-BY-4.0.
This dataset contains 1,700 examples of fixing errors in long documents, and
~28,000 sentence-level examples.
The instructions ask the model to correct errors in the text. Sometimes the
expected output is the corrected text as is; at other times it is prefixed with
"Sure, here's the corrected text". If the input text contains no errors, the
expected output is sometimes just the input text and sometimes "This text
doesn't contain grammatical errors."
The exact input and output templates can be found in
`input_templates.doc.dat`, `input_templates.sent.dat` and `generate.py`.
## Stats
Metric | Value
-------|-------
Number of document-level examples | 1,700
Number of sentence-level examples | 28,258
Number of input templates | 14
Number of output templates | 6
| [
-0.0359099917113781,
-0.7654066681861877,
0.6033446192741394,
0.1421121209859848,
-0.07940101623535156,
-0.2441505789756775,
-0.18791548907756805,
0.10580084472894669,
-0.12665723264217377,
0.22707819938659668,
-0.9278740286827087,
-1.0059099197387695,
-0.33493566513061523,
0.3537746965885... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
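The template-based generation the UA-GEC card describes (14 input templates, 6 output templates, variation in how corrections are presented) can be sketched as below. The template strings here are hypothetical placeholders standing in for the contents of the `.dat` files, and `make_example` is our name, not part of `generate.py`.

```python
import random

# Hypothetical templates; the real 14 input / 6 output templates
# ship with the dataset in input_templates.*.dat.
INPUT_TEMPLATES = [
    "Виправ помилки в цьому тексті:\n{text}",
    "Correct the errors in the following Ukrainian text:\n{text}",
]
OUTPUT_TEMPLATES = [
    "{corrected}",
    "Sure, here's the corrected text:\n{corrected}",
]

def make_example(source: str, corrected: str, rng: random.Random) -> dict:
    """Pair a randomly chosen input template with a randomly chosen
    output template to produce one instruction-tuning example."""
    prompt = rng.choice(INPUT_TEMPLATES).format(text=source)
    output = rng.choice(OUTPUT_TEMPLATES).format(corrected=corrected)
    return {"prompt": prompt, "output": output}
```

Sampling templates per example, rather than using one fixed phrasing, is what gives the dataset the output variation described above.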
f0xn0v4/RedHatOpenshiftDocs | f0xn0v4 | 2023-11-24T16:20:23Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T16:20:23Z | 2023-11-24T16:18:15.000Z | 2023-11-24T16:18:15 | Entry not found | [
-0.3227647542953491,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965083122253,
0.7915717959403992,
0.07618629932403564,
0.7746022343635559,
0.2563222348690033,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
confit/mswc | confit | 2023-11-24T17:05:48Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T17:05:48Z | 2023-11-24T16:33:13.000Z | 2023-11-24T16:33:13 | ---
configs:
- config_name: eng
data_files:
- split: train
path: eng/train-*
- split: validation
path: eng/validation-*
- split: test
path: eng/test-*
- config_name: ind
data_files:
- split: train
path: ind/train-*
- split: validation
path: ind/validation-*
- split: test
path: ind/test-*
- config_name: spa
data_files:
- split: train
path: spa/train-*
- split: validation
path: spa/validation-*
- split: test
path: spa/test-*
dataset_info:
- config_name: eng
features:
- name: filename
dtype: string
- name: label
dtype:
class_label:
names:
'0': aaron
'1': abba
'2': abel
'3': abigail
'4': abilene
'5': abner
'6': abraham
'7': abrahams
'8': abram
'9': adam
'10': agrippa
'11': alexander
'12': alexandria
'13': ammon
'14': amos
'15': andrew
'16': anna
'17': antioch
'18': antiochus
'19': apollonia
'20': arabia
'21': aram
'22': archelaus
'23': ariel
'24': artemis
'25': asa
'26': asher
'27': ashur
'28': asia
'29': assemble
'30': assyria
'31': athens
'32': augustus
'33': babylon
'34': babylonia
'35': bani
'36': barak
'37': barnabas
'38': bartholomew
'39': baruch
'40': bela
'41': benjamin
'42': berea
'43': bernice
'44': beth
'45': bethany
'46': bethel
'47': bethesda
'48': bethlehem
'49': caesar
'50': caesarea
'51': cain
'52': caleb
'53': cana
'54': canaan
'55': carmel
'56': castor
'57': cesar
'58': chios
'59': christ
'60': cilicia
'61': claudia
'62': claudius
'63': clement
'64': corinth
'65': cornelius
'66': crete
'67': cyprus
'68': cyrus
'69': dalmatia
'70': damascus
'71': dan
'72': daniel
'73': darius
'74': david
'75': deborah
'76': demetrius
'77': diana
'78': dinah
'79': dionysius
'80': drusilla
'81': eden
'82': egypt
'83': elam
'84': eli
'85': elia
'86': elias
'87': eliezer
'88': elijah
'89': elim
'90': elisabeth
'91': elizabeth
'92': elon
'93': enoch
'94': enos
'95': ephesus
'96': ephraim
'97': esther
'98': ethan
'99': ethiopia
'100': eunice
'101': euphrates
'102': eve
'103': ezra
'104': felix
'105': gabriel
'106': gad
'107': gaius
'108': galilee
'109': gaza
'110': gideon
'111': gilead
'112': goshen
'113': greece
'114': hadad
'115': hades
'116': hagar
'117': ham
'118': hannah
'119': heber
'120': hebrew
'121': hebron
'122': hermes
'123': hermon
'124': herod
'125': hiram
'126': hosanna
'127': hush
'128': immanuel
'129': india
'130': ira
'131': isaac
'132': isaiah
'133': ishmael
'134': israel
'135': italy
'136': jacob
'137': james
'138': jared
'139': jason
'140': jeremiah
'141': jericho
'142': jerusalem
'143': jesse
'144': jesus
'145': jethro
'146': jew
'147': jezebel
'148': joanna
'149': job
'150': joel
'151': john
'152': jonah
'153': jonas
'154': jonathan
'155': jordan
'156': joseph
'157': joshua
'158': josiah
'159': judah
'160': judas
'161': jude
'162': judith
'163': julia
'164': julius
'165': justus
'166': kos
'167': laban
'168': lazarus
'169': leah
'170': lebanon
'171': levi
'172': libya
'173': linus
'174': lois
'175': lot
'176': lucius
'177': luke
'178': lydia
'179': macedonia
'180': magdalene
'181': magi
'182': maker
'183': malta
'184': mariam
'185': mark
'186': martha
'187': mary
'188': matthew
'189': melchizedek
'190': mesopotamia
'191': messiah
'192': michael
'193': midian
'194': miriam
'195': moab
'196': mordecai
'197': moses
'198': myra
'199': naomi
'200': narcissus
'201': nathanael
'202': nazareth
'203': nebuchadnezzar
'204': nicolas
'205': niger
'206': nile
'207': noah
'208': paul
'209': paulus
'210': perez
'211': persia
'212': peter
'213': pharaoh
'214': philadelphia
'215': philip
'216': phoebe
'217': phoenix
'218': pontus
'219': priscilla
'220': publius
'221': rachel
'222': rebecca
'223': rebekah
'224': reuben
'225': rhoda
'226': rhodes
'227': rome
'228': rufus
'229': salem
'230': salim
'231': salome
'232': samson
'233': samuel
'234': sarah
'235': sardis
'236': satan
'237': saul
'238': seleucia
'239': seth
'240': sharon
'241': shiloh
'242': shout
'243': shun
'244': silas
'245': simeon
'246': simon
'247': sinai
'248': sion
'249': smyrna
'250': sodom
'251': solomon
'252': spain
'253': stephen
'254': susanna
'255': syracuse
'256': syria
'257': tabitha
'258': tabor
'259': tamar
'260': theophilus
'261': thomas
'262': thummim
'263': tiberius
'264': timothy
'265': titus
'266': tobias
'267': tyre
'268': urim
'269': zeus
'270': zion
splits:
- name: train
num_bytes: 1215893
num_examples: 26744
- name: validation
num_bytes: 159193
num_examples: 3491
- name: test
num_bytes: 159142
num_examples: 3491
download_size: 397181
dataset_size: 1534228
- config_name: ind
features:
- name: filename
dtype: string
- name: label
dtype:
class_label:
names:
'0': agustus
'1': anak
'2': asia
'3': dan
'4': kuat
'5': pulau
'6': raja
'7': rumahnya
'8': selama
'9': selamat
'10': selatan
'11': tahan
'12': teman
'13': tuhan
splits:
- name: train
num_bytes: 26080
num_examples: 575
- name: validation
num_bytes: 3756
num_examples: 83
- name: test
num_bytes: 3664
num_examples: 81
download_size: 12806
dataset_size: 33500
- config_name: spa
features:
- name: filename
dtype: string
- name: label
dtype:
class_label:
names:
'0': abel
'1': abismo
'2': adán
'3': agar
'4': alejandro
'5': alejandría
'6': ana
'7': andrés
'8': antioquía
'9': apolo
'10': arabia
'11': artemisa
'12': asia
'13': atenas
'14': augusto
'15': babilonia
'16': benjamín
'17': berenice
'18': bordeando
'19': capadocia
'20': caín
'21': cesar
'22': chipre
'23': claudia
'24': claudio
'25': clemente
'26': consejo
'27': constructor
'28': corinto
'29': cornelio
'30': creta
'31': cristo
'32': cuarto
'33': damasco
'34': dan
'35': daniel
'36': david
'37': demetrio
'38': dionisio
'39': dirigente
'40': efraín
'41': egipto
'42': elisabet
'43': elías
'44': eneas
'45': eran
'46': españa
'47': esteban
'48': etiopía
'49': eva
'50': evita
'51': faraón
'52': felipe
'53': filadelfia
'54': filemón
'55': filólogo
'56': gabriel
'57': gobernaba
'58': grecia
'59': hebreo
'60': hermes
'61': iliria
'62': ira
'63': isaac
'64': israel
'65': italia
'66': jacob
'67': jesús
'68': joel
'69': jordán
'70': josé
'71': juan
'72': juana
'73': judas
'74': judea
'75': judío
'76': julia
'77': julio
'78': justo
'79': libia
'80': lidia
'81': lino
'82': lucas
'83': lucio
'84': lázaro
'85': macedonia
'86': maestros
'87': magdalena
'88': malta
'89': marcos
'90': marta
'91': maría
'92': mateo
'93': matías
'94': mesopotamia
'95': mesías
'96': miguel
'97': narciso
'98': negro
'99': nicanor
'100': nicolás
'101': oiga
'102': olimpo
'103': pablo
'104': partos
'105': paulo
'106': pedro
'107': peor
'108': pesan
'109': pirro
'110': ponto
'111': rebeca
'112': reúnen
'113': roma
'114': rufo
'115': sabios
'116': salem
'117': salmón
'118': salomé
'119': salomón
'120': samuel
'121': santiago
'122': sara
'123': satanás
'124': segundo
'125': sergio
'126': set
'127': señor
'128': simeón
'129': simón
'130': siracusa
'131': siria
'132': situó
'133': sur
'134': susana
'135': tara
'136': tercio
'137': tiberio
'138': timón
'139': tiro
'140': tito
'141': tomás
'142': urbano
'143': viva
'144': zara
'145': zeus
splits:
- name: train
num_bytes: 431605
num_examples: 9283
- name: validation
num_bytes: 57583
num_examples: 1238
- name: test
num_bytes: 57583
num_examples: 1238
download_size: 148219
dataset_size: 546771
---
# Dataset Card for "mswc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.5400853157043457,
0.023455914109945297,
0.21532443165779114,
0.1932360976934433,
-0.29077568650245667,
0.20147646963596344,
0.5682920217514038,
-0.25231853127479553,
0.7210506796836853,
0.4830075800418854,
-1.0227525234222412,
-0.7005335688591003,
-0.5646139979362488,
-0.219155117869377... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ashwani21/automatic-reimbursement-tool-demo-incorrect | ashwani21 | 2023-11-24T16:39:45Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T16:39:45Z | 2023-11-24T16:39:45.000Z | 2023-11-24T16:39:45 | Entry not found | [
-0.3227647542953491,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965083122253,
0.7915717959403992,
0.07618629932403564,
0.7746022343635559,
0.2563222348690033,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ashwani21/automatic-reimbursement-tool-demo | ashwani21 | 2023-11-24T16:39:45Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T16:39:45Z | 2023-11-24T16:39:45.000Z | 2023-11-24T16:39:45 | Entry not found | [
-0.3227647542953491,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965083122253,
0.7915717959403992,
0.07618629932403564,
0.7746022343635559,
0.2563222348690033,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
HamdanXI/arb-eng-parallel-10k-splitted-translated-arabic | HamdanXI | 2023-11-24T17:07:55Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T17:07:55Z | 2023-11-24T17:07:53.000Z | 2023-11-24T17:07:53 | ---
dataset_info:
features:
- name: arabic
dtype: string
- name: english
dtype: string
- name: translated
dtype: string
splits:
- name: train
num_bytes: 4714807
num_examples: 7999
- name: validation
num_bytes: 571638
num_examples: 1000
- name: test
num_bytes: 585646
num_examples: 1000
download_size: 3399538
dataset_size: 5872091
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
guima/gemersonnovo | guima | 2023-11-24T17:21:35Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T17:21:35Z | 2023-11-24T17:11:45.000Z | 2023-11-24T17:11:45 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
RonanMcGovern/test-dataset | RonanMcGovern | 2023-11-24T17:16:57Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T17:16:57Z | 2023-11-24T17:16:57.000Z | 2023-11-24T17:16:57 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ronanarraig/sample_dataset | ronanarraig | 2023-11-27T19:19:16Z | 0 | 0 | null | [
"region:us"
] | 2023-11-27T19:19:16Z | 2023-11-24T17:28:09.000Z | 2023-11-24T17:28:09 | ---
extra_gated_prompt: "Purchase access to this repo [HERE](https://buy.stripe.com/dR616I1mo99D6pabII)"
---
# My Dataset readme | [
-0.6281819343566895,
-0.01388850063085556,
-0.14031657576560974,
0.0035137198865413666,
-0.35284721851348877,
0.15113404393196106,
0.12852418422698975,
-0.051728084683418274,
0.42672568559646606,
1.0254405736923218,
-1.0785431861877441,
-0.64533931016922,
-0.505141019821167,
0.281483948230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Kaue123456/KaueGama | Kaue123456 | 2023-11-24T17:31:45Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T17:31:45Z | 2023-11-24T17:30:58.000Z | 2023-11-24T17:30:58 | ---
license: openrail
---
| [
-0.12853369116783142,
-0.18616779148578644,
0.6529126167297363,
0.49436280131340027,
-0.193193256855011,
0.2360745668411255,
0.36071979999542236,
0.05056314915418625,
0.5793651342391968,
0.740013837814331,
-0.6508103013038635,
-0.23783960938453674,
-0.7102248668670654,
-0.04782580211758613... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
NoobPROBR/Red | NoobPROBR | 2023-11-24T17:31:54Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T17:31:54Z | 2023-11-24T17:31:35.000Z | 2023-11-24T17:31:35 | ---
license: openrail
---
| [
-0.12853369116783142,
-0.18616779148578644,
0.6529126167297363,
0.49436280131340027,
-0.193193256855011,
0.2360745668411255,
0.36071979999542236,
0.05056314915418625,
0.5793651342391968,
0.740013837814331,
-0.6508103013038635,
-0.23783960938453674,
-0.7102248668670654,
-0.04782580211758613... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Kaue123456/Andrey | Kaue123456 | 2023-11-24T17:32:41Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T17:32:41Z | 2023-11-24T17:32:13.000Z | 2023-11-24T17:32:13 | ---
license: openrail
---
| [
-0.12853369116783142,
-0.18616779148578644,
0.6529126167297363,
0.49436280131340027,
-0.193193256855011,
0.2360745668411255,
0.36071979999542236,
0.05056314915418625,
0.5793651342391968,
0.740013837814331,
-0.6508103013038635,
-0.23783960938453674,
-0.7102248668670654,
-0.04782580211758613... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
NoobPROBR/Roger | NoobPROBR | 2023-11-24T17:33:38Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T17:33:38Z | 2023-11-24T17:32:43.000Z | 2023-11-24T17:32:43 | ---
license: openrail
---
| [
-0.12853369116783142,
-0.18616779148578644,
0.6529126167297363,
0.49436280131340027,
-0.193193256855011,
0.2360745668411255,
0.36071979999542236,
0.05056314915418625,
0.5793651342391968,
0.740013837814331,
-0.6508103013038635,
-0.23783960938453674,
-0.7102248668670654,
-0.04782580211758613... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
teamwork/teamwork | teamwork | 2023-11-24T17:37:30Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-24T17:37:30Z | 2023-11-24T17:37:30.000Z | 2023-11-24T17:37:30 | ---
license: mit
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
nxsbr/will | nxsbr | 2023-11-24T17:46:13Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T17:46:13Z | 2023-11-24T17:45:55.000Z | 2023-11-24T17:45:55 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
cbensimon/CDN | cbensimon | 2023-11-24T17:50:52Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T17:50:52Z | 2023-11-24T17:50:24.000Z | 2023-11-24T17:50:24 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
adalbertoljardim/pozedorodonovo | adalbertoljardim | 2023-11-24T19:14:59Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T19:14:59Z | 2023-11-24T18:08:35.000Z | 2023-11-24T18:08:35 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Zerobeastskaiai/Karakter_Anime_Pake_Baju_SMA-LoRA | Zerobeastskaiai | 2023-11-24T18:44:09Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T18:44:09Z | 2023-11-24T18:41:09.000Z | 2023-11-24T18:41:09 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AppleHarem/serina_bluearchive | AppleHarem | 2023-11-24T19:06:20Z | 0 | 0 | null | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2023-11-24T19:06:20Z | 2023-11-24T19:06:02.000Z | 2023-11-24T19:06:02 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of serina (Blue Archive)
This is the dataset of serina (Blue Archive), containing 194 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)) ([LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 194 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 528 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 611 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 194 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 194 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 194 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 528 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 528 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 507 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 611 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 611 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
| [
-0.6130601763725281,
-0.16893257200717926,
0.38693326711654663,
0.12235689908266068,
-0.26879701018333435,
-0.23226520419120789,
0.26290878653526306,
-0.5377820730209351,
0.7464002370834351,
0.7267542481422424,
-0.8972867131233215,
-0.7491140365600586,
-0.5793976783752441,
0.40940961241722... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Felladrin/Open-Platypus-train.csv | Felladrin | 2023-11-24T20:06:12Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T20:06:12Z | 2023-11-24T19:32:11.000Z | 2023-11-24T19:32:11 | # Garage-bAInd's Open-Platypus dataset formatted and converted to CSV
This is [garage-bAInd/Open-Platypus](https://huggingface.co/datasets/garage-bAInd/Open-Platypus) formatted and converted to CSV to train models in [AutoTrain Advanced](https://github.com/huggingface/autotrain-advanced).
| [
-0.05904344841837883,
-0.27903905510902405,
0.10797616094350815,
0.42193353176116943,
-0.15424947440624237,
0.1215866208076477,
-0.20814937353134155,
0.27350160479545593,
0.22308963537216187,
0.5689935088157654,
-0.8553267121315002,
-0.6563433408737183,
-0.23578742146492004,
-0.37109571695... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
brando/Coq-Gym-Data-Set | brando | 2023-11-24T19:41:52Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-24T19:41:52Z | 2023-11-24T19:41:52.000Z | 2023-11-24T19:41:52 | ---
license: apache-2.0
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Vinnyh589/PersonagensJogos | Vinnyh589 | 2023-11-24T19:54:49Z | 0 | 0 | null | [
"license:unknown",
"region:us"
] | 2023-11-24T19:54:49Z | 2023-11-24T19:54:07.000Z | 2023-11-24T19:54:07 | ---
license: unknown
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
sordonia/adauni-v2-flat | sordonia | 2023-11-25T00:35:07Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T00:35:07Z | 2023-11-24T19:56:56.000Z | 2023-11-24T19:56:56 | # Used datasets:
## sordonia/flan-10k-flat
## sordonia/mmlu-qa-flat
## sordonia/platypus-flat
## sordonia/ultrachat-32c-10k-flat
## Total number of tasks: 439
| [
-0.10726218670606613,
-0.1413755863904953,
0.3846971094608307,
0.7099618315696716,
-0.28421762585639954,
-0.11022768914699554,
0.18597127497196198,
0.10952529311180115,
0.5028472542762756,
0.5149446129798889,
-0.9425040483474731,
-0.6543793082237244,
-0.3561030924320221,
0.3299581706523895... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AppleHarem/le_malin_azurlane | AppleHarem | 2023-11-24T20:31:39Z | 0 | 0 | null | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2023-11-24T20:31:39Z | 2023-11-24T20:31:19.000Z | 2023-11-24T20:31:19 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of le_malin (Azur Lane)
This is the dataset of le_malin (Azur Lane), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)) ([LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 530 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 598 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 530 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 530 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 347 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 598 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 598 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
| [
-0.8232775926589966,
-0.28962185978889465,
0.5295162796974182,
0.10237463563680649,
-0.14660978317260742,
-0.289837121963501,
0.22801025211811066,
-0.4327082931995392,
0.6173948645591736,
0.8222758173942566,
-0.9786547422409058,
-0.8652076721191406,
-0.6635850667953491,
0.40860217809677124... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
harshitaay/Atlas_CodeLlama7bInstruct_Tokenized | harshitaay | 2023-11-26T00:38:55Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T00:38:55Z | 2023-11-24T20:53:58.000Z | 2023-11-24T20:53:58 | ---
dataset_info:
features:
- name: text
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 865791958
num_examples: 150523
- name: validation
num_bytes: 111633790
num_examples: 18815
download_size: 157402763
dataset_size: 977425748
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_euclaise__Ferret_7B_public | open-llm-leaderboard | 2023-11-24T20:55:03Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T20:55:03Z | 2023-11-24T20:54:18.000Z | 2023-11-24T20:54:18 | ---
pretty_name: Evaluation run of euclaise/Ferret_7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [euclaise/Ferret_7B](https://huggingface.co/euclaise/Ferret_7B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_euclaise__Ferret_7B_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-24T20:51:17.073037](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__Ferret_7B_public/blob/main/results_2023-11-24T20-51-17.073037.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5942716228548698,\n\
\ \"acc_stderr\": 0.033152282530121875,\n \"acc_norm\": 0.6048893408330033,\n\
\ \"acc_norm_stderr\": 0.03399052086609082,\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326923,\n \"mc2\": 0.3993660994529629,\n\
\ \"mc2_stderr\": 0.014553301107110514,\n \"em\": 0.001572986577181208,\n\
\ \"em_stderr\": 0.00040584511324177344,\n \"f1\": 0.06532718120805381,\n\
\ \"f1_stderr\": 0.0014896342146480434\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5776450511945392,\n \"acc_stderr\": 0.014434138713379983,\n\
\ \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.014163366896192598\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6250746863174667,\n\
\ \"acc_stderr\": 0.004831142570475506,\n \"acc_norm\": 0.8132842063333997,\n\
\ \"acc_norm_stderr\": 0.0038888680996290764\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n \
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.03899073687357335,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.03899073687357335\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.03772446857518027,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.03772446857518027\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601677,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601677\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.0436031486007746,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.0436031486007746\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n\
\ \"acc_stderr\": 0.026593084516572277,\n \"acc_norm\": 0.6774193548387096,\n\
\ \"acc_norm_stderr\": 0.026593084516572277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397467,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397467\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.024939313906940798,\n\
\ \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.024939313906940798\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7889908256880734,\n \"acc_stderr\": 0.01749392240411265,\n \"\
acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.01749392240411265\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608043,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145635,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145635\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n\
\ \"acc_stderr\": 0.026035386098951292,\n \"acc_norm\": 0.8034188034188035,\n\
\ \"acc_norm_stderr\": 0.026035386098951292\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.789272030651341,\n\
\ \"acc_stderr\": 0.014583812465862545,\n \"acc_norm\": 0.789272030651341,\n\
\ \"acc_norm_stderr\": 0.014583812465862545\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.02599247202930638,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.02599247202930638\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3787709497206704,\n\
\ \"acc_stderr\": 0.01622353351036512,\n \"acc_norm\": 0.3787709497206704,\n\
\ \"acc_norm_stderr\": 0.01622353351036512\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.02724561304721537,\n\
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.02724561304721537\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.026981478043648043,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.026981478043648043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.026041766202717163,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.026041766202717163\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236837,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236837\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3898305084745763,\n\
\ \"acc_stderr\": 0.012456386619082606,\n \"acc_norm\": 0.3898305084745763,\n\
\ \"acc_norm_stderr\": 0.012456386619082606\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6225490196078431,\n \"acc_stderr\": 0.019610851474880297,\n \
\ \"acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.019610851474880297\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.046313813194254656,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.046313813194254656\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087558,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087558\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017204,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017204\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774711,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774711\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326923,\n \"mc2\": 0.3993660994529629,\n\
\ \"mc2_stderr\": 0.014553301107110514\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126742\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.001572986577181208,\n \
\ \"em_stderr\": 0.00040584511324177344,\n \"f1\": 0.06532718120805381,\n\
\ \"f1_stderr\": 0.0014896342146480434\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.02047005307050796,\n \"acc_stderr\": 0.003900413385915721\n\
\ }\n}\n```"
repo_url: https://huggingface.co/euclaise/Ferret_7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|arc:challenge|25_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|drop|3_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|gsm8k|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hellaswag|10_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-24T20-51-17.073037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-24T20-51-17.073037.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- '**/details_harness|winogrande|5_2023-11-24T20-51-17.073037.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-24T20-51-17.073037.parquet'
- config_name: results
data_files:
- split: 2023_11_24T20_51_17.073037
path:
- results_2023-11-24T20-51-17.073037.parquet
- split: latest
path:
- results_2023-11-24T20-51-17.073037.parquet
---
# Dataset Card for Evaluation run of euclaise/Ferret_7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/euclaise/Ferret_7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [euclaise/Ferret_7B](https://huggingface.co/euclaise/Ferret_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_euclaise__Ferret_7B_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-24T20:51:17.073037](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__Ferret_7B_public/blob/main/results_2023-11-24T20-51-17.073037.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5942716228548698,
"acc_stderr": 0.033152282530121875,
"acc_norm": 0.6048893408330033,
"acc_norm_stderr": 0.03399052086609082,
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326923,
"mc2": 0.3993660994529629,
"mc2_stderr": 0.014553301107110514,
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177344,
"f1": 0.06532718120805381,
"f1_stderr": 0.0014896342146480434
},
"harness|arc:challenge|25": {
"acc": 0.5776450511945392,
"acc_stderr": 0.014434138713379983,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.014163366896192598
},
"harness|hellaswag|10": {
"acc": 0.6250746863174667,
"acc_stderr": 0.004831142570475506,
"acc_norm": 0.8132842063333997,
"acc_norm_stderr": 0.0038888680996290764
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.03899073687357335,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.03899073687357335
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518027,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518027
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601677,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601677
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0436031486007746,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0436031486007746
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572277,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397467,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.024939313906940798,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.024939313906940798
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.01749392240411265,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.01749392240411265
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145635
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.026035386098951292,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.026035386098951292
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.789272030651341,
"acc_stderr": 0.014583812465862545,
"acc_norm": 0.789272030651341,
"acc_norm_stderr": 0.014583812465862545
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.02599247202930638,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.02599247202930638
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3787709497206704,
"acc_stderr": 0.01622353351036512,
"acc_norm": 0.3787709497206704,
"acc_norm_stderr": 0.01622353351036512
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.02724561304721537,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.02724561304721537
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.026981478043648043,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.026981478043648043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.026041766202717163,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.026041766202717163
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236837,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236837
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3898305084745763,
"acc_stderr": 0.012456386619082606,
"acc_norm": 0.3898305084745763,
"acc_norm_stderr": 0.012456386619082606
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.019610851474880297,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.019610851474880297
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.046313813194254656,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.046313813194254656
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.030862144921087558,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.030862144921087558
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017204,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017204
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774711,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774711
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326923,
"mc2": 0.3993660994529629,
"mc2_stderr": 0.014553301107110514
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126742
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177344,
"f1": 0.06532718120805381,
"f1_stderr": 0.0014896342146480434
},
"harness|gsm8k|5": {
"acc": 0.02047005307050796,
"acc_stderr": 0.003900413385915721
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7461003065109253,
-0.8500183820724487,
0.27222639322280884,
0.237075075507164,
-0.15930230915546417,
-0.0395645909011364,
0.014010133221745491,
-0.2417907863855362,
0.5876929759979248,
-0.03224385157227516,
-0.49011707305908203,
-0.7067011594772339,
-0.41967639327049255,
0.2732288241386... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
haroldim/voz-femi-mult | haroldim | 2023-11-24T21:12:24Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T21:12:24Z | 2023-11-24T21:10:36.000Z | 2023-11-24T21:10:36 | ---
license: openrail
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
josedonoso/busi-dataset-v2 | josedonoso | 2023-11-24T21:28:54Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T21:28:54Z | 2023-11-24T21:28:48.000Z | 2023-11-24T21:28:48 | ---
dataset_info:
features:
- name: image
dtype: image
- name: mask
dtype: image
splits:
- name: train
num_bytes: 271198683.4497682
num_examples: 517
- name: test
num_bytes: 68193092.55023184
num_examples: 130
download_size: 90195143
dataset_size: 339391776.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
RIW/small-coco-wm_50_3 | RIW | 2023-11-24T21:44:01Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T21:44:01Z | 2023-11-24T21:43:07.000Z | 2023-11-24T21:43:07 | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
- name: url
dtype: string
- name: key
dtype: string
- name: status
dtype: string
- name: error_message
dtype: 'null'
- name: width
dtype: int64
- name: height
dtype: int64
- name: original_width
dtype: int64
- name: original_height
dtype: int64
- name: exif
dtype: string
- name: sha256
dtype: string
splits:
- name: train
num_bytes: 1556738716.452
num_examples: 18983
- name: validation
num_bytes: 1686246014.3
num_examples: 18932
download_size: 722637591
dataset_size: 3242984730.752
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
furkanakkurt1618/pos_dataset-UD_Turkish-BOUN-v2.13 | furkanakkurt1618 | 2023-11-24T22:36:38Z | 0 | 0 | null | [
"task_categories:token-classification",
"size_categories:1K<n<10K",
"language:tr",
"license:cc-by-sa-4.0",
"region:us"
] | 2023-11-24T22:36:38Z | 2023-11-24T22:20:07.000Z | 2023-11-24T22:20:07 | ---
license: cc-by-sa-4.0
task_categories:
- token-classification
language:
- tr
pretty_name: UD Turkish BOUN Treebank POS Tagging
size_categories:
- 1K<n<10K
--- | [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Kaue123456/DuduJoaoBatista | Kaue123456 | 2023-11-24T22:21:28Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T22:21:28Z | 2023-11-24T22:20:24.000Z | 2023-11-24T22:20:24 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
IshSiva/Safety_Awareness_in_LLMs | IshSiva | 2023-11-24T22:36:20Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T22:36:20Z | 2023-11-24T22:29:06.000Z | 2023-11-24T22:29:06 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
MxnBeats/jeleelyeah | MxnBeats | 2023-11-24T22:35:19Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T22:35:19Z | 2023-11-24T22:31:11.000Z | 2023-11-24T22:31:11 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
furkanakkurt1618/pos_dataset-UD_Turkish-IMST-v2.13 | furkanakkurt1618 | 2023-11-24T22:43:23Z | 0 | 0 | null | [
"task_categories:token-classification",
"size_categories:1K<n<10K",
"language:tr",
"license:cc-by-sa-4.0",
"region:us"
] | 2023-11-24T22:43:23Z | 2023-11-24T22:41:23.000Z | 2023-11-24T22:41:23 | ---
license: cc-by-sa-4.0
task_categories:
- token-classification
language:
- tr
pretty_name: UD Turkish IMST Treebank POS Tagging
size_categories:
- 1K<n<10K
--- | [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
bilalelmanja/six_sigma | bilalelmanja | 2023-11-24T22:51:13Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-24T22:51:13Z | 2023-11-24T22:50:39.000Z | 2023-11-24T22:50:39 | ---
license: mit
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
jadasdn/trial_Level_2_A | jadasdn | 2023-11-24T23:02:43Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T23:02:43Z | 2023-11-24T22:58:05.000Z | 2023-11-24T22:58:05 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 2624020367.8833747
num_examples: 58098
download_size: 2607714351
dataset_size: 2624020367.8833747
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "trial_Level_2_A"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.1717277616262436,
-0.2635239064693451,
0.14120742678642273,
0.4978569447994232,
-0.1607837975025177,
0.009813254699110985,
0.6181491613388062,
-0.18527796864509583,
0.5368335247039795,
0.40255531668663025,
-0.807543158531189,
-0.7902584671974182,
-0.7097730040550232,
-0.3929582834243774... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
jadasdn/cdataset | jadasdn | 2023-11-24T23:25:16Z | 0 | 0 | null | [
"region:us"
] | 2023-11-24T23:25:16Z | 2023-11-24T23:24:21.000Z | 2023-11-24T23:24:21 | Entry not found | [
-0.32276472449302673,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965679168701,
0.7915717363357544,
0.07618629932403564,
0.7746022939682007,
0.2563222646713257,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AcasoQ/coachvoz | AcasoQ | 2023-11-24T23:29:31Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-24T23:29:31Z | 2023-11-24T23:28:14.000Z | 2023-11-24T23:28:14 | ---
license: openrail
---
| [
-0.12853369116783142,
-0.18616779148578644,
0.6529126167297363,
0.49436280131340027,
-0.193193256855011,
0.2360745668411255,
0.36071979999542236,
0.05056314915418625,
0.5793651342391968,
0.740013837814331,
-0.6508103013038635,
-0.23783960938453674,
-0.7102248668670654,
-0.04782580211758613... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Pipp/SplatTest | Pipp | 2023-11-24T23:31:27Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-24T23:31:27Z | 2023-11-24T23:30:55.000Z | 2023-11-24T23:30:55 | ---
license: mit
---
| [
-0.12853369116783142,
-0.18616779148578644,
0.6529126167297363,
0.49436280131340027,
-0.193193256855011,
0.2360745668411255,
0.36071979999542236,
0.05056314915418625,
0.5793651342391968,
0.740013837814331,
-0.6508103013038635,
-0.23783960938453674,
-0.7102248668670654,
-0.04782580211758613... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
shredder-31/NeuraLearnAcademy-Llama-2-7b-QG-dataset-Prdiction | shredder-31 | 2023-11-25T00:01:35Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T00:01:35Z | 2023-11-25T00:01:30.000Z | 2023-11-25T00:01:30 | ---
dataset_info:
features:
- name: prdiction
dtype: string
splits:
- name: train
num_bytes: 10383
num_examples: 25
download_size: 8411
dataset_size: 10383
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
| [
-0.12853369116783142,
-0.18616779148578644,
0.6529126167297363,
0.49436280131340027,
-0.193193256855011,
0.2360745668411255,
0.36071979999542236,
0.05056314915418625,
0.5793651342391968,
0.740013837814331,
-0.6508103013038635,
-0.23783960938453674,
-0.7102248668670654,
-0.04782580211758613... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
adalbertoljardim/poze | adalbertoljardim | 2023-11-25T00:18:00Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T00:18:00Z | 2023-11-25T00:17:11.000Z | 2023-11-25T00:17:11 | ---
license: openrail
---
| [
-0.12853369116783142,
-0.18616779148578644,
0.6529126167297363,
0.49436280131340027,
-0.193193256855011,
0.2360745668411255,
0.36071979999542236,
0.05056314915418625,
0.5793651342391968,
0.740013837814331,
-0.6508103013038635,
-0.23783960938453674,
-0.7102248668670654,
-0.04782580211758613... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Pablao0948/Smurfzin | Pablao0948 | 2023-11-25T00:42:26Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T00:42:26Z | 2023-11-25T00:42:08.000Z | 2023-11-25T00:42:08 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
NIKMIKE/MirabelSinging | NIKMIKE | 2023-11-25T01:21:37Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T01:21:37Z | 2023-11-25T01:20:58.000Z | 2023-11-25T01:20:58 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
hugosousa/SmallTimelines | hugosousa | 2023-11-25T17:36:32Z | 0 | 0 | null | [
"size_categories:10K<n<100K",
"language:en",
"license:mit",
"region:us"
] | 2023-11-25T17:36:32Z | 2023-11-25T01:27:44.000Z | 2023-11-25T01:27:44 | ---
license: mit
language:
- en
pretty_name: Small Timelines
size_categories:
- 10K<n<100K
configs:
- config_name: one
data_files:
- split: train
path: "one/train.json"
- split: test
path: "one/test.json"
- config_name: two
data_files:
- split: train
path: "two/train.json"
- split: test
path: "two/test.json"
- config_name: three
data_files:
- split: train
path: "three/train.json"
- split: test
path: "three/test.json"
- config_name: four
data_files:
- split: train
path: "four/train.json"
- split: test
path: "four/test.json"
- config_name: five
data_files:
- split: train
path: "five/train.json"
- split: test
path: "five/test.json"
--- | [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
music09/selena | music09 | 2023-11-25T02:02:30Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T02:02:30Z | 2023-11-25T01:28:15.000Z | 2023-11-25T01:28:15 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Chinchis/danduran | Chinchis | 2023-11-25T01:33:45Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T01:33:45Z | 2023-11-25T01:29:55.000Z | 2023-11-25T01:29:55 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
NIKMIKE/MIRABEL | NIKMIKE | 2023-11-25T01:32:28Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T01:32:28Z | 2023-11-25T01:31:28.000Z | 2023-11-25T01:31:28 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_Sayan01__Llama-Flan-XL2base_public | open-llm-leaderboard | 2023-11-25T01:32:24Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T01:32:24Z | 2023-11-25T01:31:39.000Z | 2023-11-25T01:31:39 | ---
pretty_name: Evaluation run of Sayan01/Llama-Flan-XL2base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sayan01/Llama-Flan-XL2base](https://huggingface.co/Sayan01/Llama-Flan-XL2base)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sayan01__Llama-Flan-XL2base_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-25T01:29:13.925640](https://huggingface.co/datasets/open-llm-leaderboard/details_Sayan01__Llama-Flan-XL2base_public/blob/main/results_2023-11-25T01-29-13.925640.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23221079815429288,\n\
\ \"acc_stderr\": 0.02994811714846116,\n \"acc_norm\": 0.23187497505966656,\n\
\ \"acc_norm_stderr\": 0.030736580620987688,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.01500067437357034,\n \"mc2\": 0.5058224656335896,\n\
\ \"mc2_stderr\": 0.016425425630600676,\n \"em\": 0.00010486577181208053,\n\
\ \"em_stderr\": 0.00010486577181208623,\n \"f1\": 0.0029037332214765076,\n\
\ \"f1_stderr\": 0.0002952362942135874\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.1757679180887372,\n \"acc_stderr\": 0.01112285086312048,\n\
\ \"acc_norm\": 0.20648464163822525,\n \"acc_norm_stderr\": 0.011828865619002316\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2592113124875523,\n\
\ \"acc_stderr\": 0.004373062283376514,\n \"acc_norm\": 0.2533359888468433,\n\
\ \"acc_norm_stderr\": 0.0043403282041351975\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2696078431372549,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2503259452411995,\n\
\ \"acc_stderr\": 0.011064151027165443,\n \"acc_norm\": 0.2503259452411995,\n\
\ \"acc_norm_stderr\": 0.011064151027165443\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2423500611995104,\n \"mc1_stderr\": 0.01500067437357034,\n\
\ \"mc2\": 0.5058224656335896,\n \"mc2_stderr\": 0.016425425630600676\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.5090765588003157,\n\
\ \"acc_stderr\": 0.014050170094497704\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.00010486577181208053,\n \"em_stderr\": 0.00010486577181208623,\n\
\ \"f1\": 0.0029037332214765076,\n \"f1_stderr\": 0.0002952362942135874\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Sayan01/Llama-Flan-XL2base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|arc:challenge|25_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|drop|3_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|gsm8k|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hellaswag|10_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T01-29-13.925640.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-25T01-29-13.925640.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- '**/details_harness|winogrande|5_2023-11-25T01-29-13.925640.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-25T01-29-13.925640.parquet'
- config_name: results
data_files:
- split: 2023_11_25T01_29_13.925640
path:
- results_2023-11-25T01-29-13.925640.parquet
- split: latest
path:
- results_2023-11-25T01-29-13.925640.parquet
---
# Dataset Card for Evaluation run of Sayan01/Llama-Flan-XL2base
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sayan01/Llama-Flan-XL2base
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sayan01/Llama-Flan-XL2base](https://huggingface.co/Sayan01/Llama-Flan-XL2base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sayan01__Llama-Flan-XL2base_public",
"harness_winogrande_5",
	split="latest")
```
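Each per-task configuration in the YAML header above follows the naming pattern `harness_<task>_<num_fewshot>`. As a small illustrative sketch (the helper function below is hypothetical, not part of the `datasets` library), a config name can be built programmatically before passing it to `load_dataset`:

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Build a config name matching the config_name entries in the
    YAML header of this card, e.g. 'harness_hendrycksTest_abstract_algebra_5'."""
    return f"harness_{task}_{num_fewshot}"

# Example: the 5-shot MMLU abstract algebra config.
print(harness_config_name("hendrycksTest_abstract_algebra", 5))
# → harness_hendrycksTest_abstract_algebra_5
```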
## Latest results
These are the [latest results from run 2023-11-25T01:29:13.925640](https://huggingface.co/datasets/open-llm-leaderboard/details_Sayan01__Llama-Flan-XL2base_public/blob/main/results_2023-11-25T01-29-13.925640.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.23221079815429288,
"acc_stderr": 0.02994811714846116,
"acc_norm": 0.23187497505966656,
"acc_norm_stderr": 0.030736580620987688,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.01500067437357034,
"mc2": 0.5058224656335896,
"mc2_stderr": 0.016425425630600676,
"em": 0.00010486577181208053,
"em_stderr": 0.00010486577181208623,
"f1": 0.0029037332214765076,
"f1_stderr": 0.0002952362942135874
},
"harness|arc:challenge|25": {
"acc": 0.1757679180887372,
"acc_stderr": 0.01112285086312048,
"acc_norm": 0.20648464163822525,
"acc_norm_stderr": 0.011828865619002316
},
"harness|hellaswag|10": {
"acc": 0.2592113124875523,
"acc_stderr": 0.004373062283376514,
"acc_norm": 0.2533359888468433,
"acc_norm_stderr": 0.0043403282041351975
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2503259452411995,
"acc_stderr": 0.011064151027165443,
"acc_norm": 0.2503259452411995,
"acc_norm_stderr": 0.011064151027165443
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.01500067437357034,
"mc2": 0.5058224656335896,
"mc2_stderr": 0.016425425630600676
},
"harness|winogrande|5": {
"acc": 0.5090765588003157,
"acc_stderr": 0.014050170094497704
},
"harness|drop|3": {
"em": 0.00010486577181208053,
"em_stderr": 0.00010486577181208623,
"f1": 0.0029037332214765076,
"f1_stderr": 0.0002952362942135874
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
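As a sketch of how these per-task numbers can be aggregated (the dictionary below is a hard-coded excerpt of the JSON above, not loaded from the repository), an MMLU-style average accuracy over the `hendrycksTest` tasks can be computed like this:

```python
# Excerpt of the results above (three of the 57 hendrycksTest tasks,
# plus one non-MMLU task to show the filtering).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.18518518518518517},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.17763157894736842},
    "harness|winogrande|5": {"acc": 0.5090765588003157},
}

# Average accuracy over the MMLU (hendrycksTest) tasks only.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
avg_acc = sum(mmlu) / len(mmlu)
print(round(avg_acc, 4))
# → 0.1943
```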
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7165306210517883,
-0.8306719064712524,
0.28212279081344604,
0.20160330832004547,
-0.1621747463941574,
-0.061660196632146835,
0.0002363904204685241,
-0.2581086754798889,
0.6127928495407104,
-0.02242851071059704,
-0.5046493411064148,
-0.6576284170150757,
-0.44447630643844604,
0.2722079157... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
kalomaze/MiniSymposium-Demo-Dataset | kalomaze | 2023-11-25T01:56:44Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T01:56:44Z | 2023-11-25T01:56:06.000Z | 2023-11-25T01:56:06 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AppleHarem/nagato_azurlane | AppleHarem | 2023-11-25T02:00:03Z | 0 | 0 | null | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2023-11-25T02:00:03Z | 2023-11-25T01:59:38.000Z | 2023-11-25T01:59:38 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nagato (Azur Lane)
This is the dataset of nagato (Azur Lane), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)). ([LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI))
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 520 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 584 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 520 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 520 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 406 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 584 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 584 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
| [
-0.7306652069091797,
-0.2319800853729248,
0.5097885131835938,
0.0874442309141159,
-0.16297823190689087,
-0.1299002766609192,
0.23620565235614777,
-0.43810921907424927,
0.5746597051620483,
0.7724752426147461,
-0.8560051321983337,
-0.8051512241363525,
-0.5158411264419556,
0.44655466079711914... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
gustavindocapars/sserra | gustavindocapars | 2023-11-25T02:36:44Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T02:36:44Z | 2023-11-25T02:36:17.000Z | 2023-11-25T02:36:17 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
guima/mlteste | guima | 2023-11-25T03:17:02Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T03:17:02Z | 2023-11-25T02:56:06.000Z | 2023-11-25T02:56:06 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
LeoMito/sla | LeoMito | 2023-11-25T02:56:48Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T02:56:48Z | 2023-11-25T02:56:48.000Z | 2023-11-25T02:56:48 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
furry-br/barbie | furry-br | 2023-11-25T02:57:43Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T02:57:43Z | 2023-11-25T02:57:30.000Z | 2023-11-25T02:57:30 | ---
license: openrail
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
bot-yaya/undl_zh2en_translation | bot-yaya | 2023-11-25T04:01:46Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T04:01:46Z | 2023-11-25T03:16:58.000Z | 2023-11-25T03:16:58 | ---
dataset_info:
features:
- name: clean_zh
sequence: string
- name: clean_en
sequence: string
- name: record
dtype: string
- name: zh2en
sequence: string
splits:
- name: train
num_bytes: 13263355893
num_examples: 165840
download_size: 6373670636
dataset_size: 13263355893
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "undl_zh2en_translation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.36738333106040955,
-0.007628054358065128,
0.24234452843666077,
0.22498483955860138,
-0.4863714873790741,
-0.05069827288389206,
-0.0177144818007946,
-0.26499223709106445,
0.5285458564758301,
0.48849615454673767,
-0.9127941727638245,
-0.9340790510177612,
-0.5130995512008667,
-0.1386667340... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_TheBloke__Orca-2-13B-GPTQ_public | open-llm-leaderboard | 2023-11-25T03:46:14Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T03:46:14Z | 2023-11-25T03:45:31.000Z | 2023-11-25T03:45:31 | ---
pretty_name: Evaluation run of TheBloke/Orca-2-13B-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Orca-2-13B-GPTQ](https://huggingface.co/TheBloke/Orca-2-13B-GPTQ) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Orca-2-13B-GPTQ_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-25T03:42:21.410226](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Orca-2-13B-GPTQ_public/blob/main/results_2023-11-25T03-42-21.410226.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5887851314518572,\n\
\ \"acc_stderr\": 0.032958137391722146,\n \"acc_norm\": 0.5969185976587905,\n\
\ \"acc_norm_stderr\": 0.03368773395313244,\n \"mc1\": 0.38555691554467564,\n\
\ \"mc1_stderr\": 0.01703883901059167,\n \"mc2\": 0.5514098320774886,\n\
\ \"mc2_stderr\": 0.0160327733300155,\n \"em\": 0.42606963087248323,\n\
\ \"em_stderr\": 0.0050641847856105855,\n \"f1\": 0.5302139261744996,\n\
\ \"f1_stderr\": 0.004659796001509701\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5614334470989761,\n \"acc_stderr\": 0.01450068261821286,\n\
\ \"acc_norm\": 0.5981228668941979,\n \"acc_norm_stderr\": 0.014327268614578274\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6037641904003187,\n\
\ \"acc_stderr\": 0.004881148866874181,\n \"acc_norm\": 0.7911770563632743,\n\
\ \"acc_norm_stderr\": 0.004056369096954941\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n\
\ \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.029946498567699948,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.029946498567699948\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.040166600304512336,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.040166600304512336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.03801685104524458,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.03801685104524458\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105655,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.0248708152510571,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.0248708152510571\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n\
\ \"acc_stderr\": 0.02564938106302926,\n \"acc_norm\": 0.7161290322580646,\n\
\ \"acc_norm_stderr\": 0.02564938106302926\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.034991131376767445,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.034991131376767445\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7171717171717171,\n \"acc_stderr\": 0.032087795587867514,\n \"\
acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.032087795587867514\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397443,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.031753678460966245,\n\
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.031753678460966245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154343,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764377,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764377\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.014866821664709595,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.014866821664709595\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647897,\n\
\ \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647897\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.20782122905027933,\n\
\ \"acc_stderr\": 0.013570248325081347,\n \"acc_norm\": 0.20782122905027933,\n\
\ \"acc_norm_stderr\": 0.013570248325081347\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.02758281141515961,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.02758281141515961\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n\
\ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4348109517601043,\n\
\ \"acc_stderr\": 0.012661233805616302,\n \"acc_norm\": 0.4348109517601043,\n\
\ \"acc_norm_stderr\": 0.012661233805616302\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6045751633986928,\n \"acc_stderr\": 0.019780465954777508,\n \
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.019780465954777508\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017204,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017204\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38555691554467564,\n\
\ \"mc1_stderr\": 0.01703883901059167,\n \"mc2\": 0.5514098320774886,\n\
\ \"mc2_stderr\": 0.0160327733300155\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183525\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.42606963087248323,\n \
\ \"em_stderr\": 0.0050641847856105855,\n \"f1\": 0.5302139261744996,\n\
\ \"f1_stderr\": 0.004659796001509701\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.155420773313116,\n \"acc_stderr\": 0.009979689409499152\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Orca-2-13B-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|arc:challenge|25_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|drop|3_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|gsm8k|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hellaswag|10_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T03-42-21.410226.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-25T03-42-21.410226.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- '**/details_harness|winogrande|5_2023-11-25T03-42-21.410226.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-25T03-42-21.410226.parquet'
- config_name: results
data_files:
- split: 2023_11_25T03_42_21.410226
path:
- results_2023-11-25T03-42-21.410226.parquet
- split: latest
path:
- results_2023-11-25T03-42-21.410226.parquet
---
# Dataset Card for Evaluation run of TheBloke/Orca-2-13B-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Orca-2-13B-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Orca-2-13B-GPTQ](https://huggingface.co/TheBloke/Orca-2-13B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
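The per-run split names above are derived from the run timestamp; a minimal sketch of that mapping, inferred from the split names listed in this card (hyphens and colons become underscores) rather than from official documentation:

```python
# Sketch: derive a per-run split name from a run timestamp.
# The replacement rule is inferred from the split names in this card.
timestamp = "2023-11-25T03:42:21.410226"
split_name = timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_11_25T03_42_21.410226
```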
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Orca-2-13B-GPTQ_public",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-11-25T03:42:21.410226](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Orca-2-13B-GPTQ_public/blob/main/results_2023-11-25T03-42-21.410226.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results files and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5887851314518572,
"acc_stderr": 0.032958137391722146,
"acc_norm": 0.5969185976587905,
"acc_norm_stderr": 0.03368773395313244,
"mc1": 0.38555691554467564,
"mc1_stderr": 0.01703883901059167,
"mc2": 0.5514098320774886,
"mc2_stderr": 0.0160327733300155,
"em": 0.42606963087248323,
"em_stderr": 0.0050641847856105855,
"f1": 0.5302139261744996,
"f1_stderr": 0.004659796001509701
},
"harness|arc:challenge|25": {
"acc": 0.5614334470989761,
"acc_stderr": 0.01450068261821286,
"acc_norm": 0.5981228668941979,
"acc_norm_stderr": 0.014327268614578274
},
"harness|hellaswag|10": {
"acc": 0.6037641904003187,
"acc_stderr": 0.004881148866874181,
"acc_norm": 0.7911770563632743,
"acc_norm_stderr": 0.004056369096954941
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.029946498567699948,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.029946498567699948
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.040166600304512336,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.040166600304512336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.03801685104524458,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.03801685104524458
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105655,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.0433913832257986,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.0433913832257986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.0248708152510571,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.0248708152510571
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.02564938106302926,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.02564938106302926
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7171717171717171,
"acc_stderr": 0.032087795587867514,
"acc_norm": 0.7171717171717171,
"acc_norm_stderr": 0.032087795587867514
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397443,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.031753678460966245,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.031753678460966245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154343,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.02616056824660146,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.02616056824660146
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764377,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764377
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.014866821664709595,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.014866821664709595
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6791907514450867,
"acc_stderr": 0.025131000233647897,
"acc_norm": 0.6791907514450867,
"acc_norm_stderr": 0.025131000233647897
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.20782122905027933,
"acc_stderr": 0.013570248325081347,
"acc_norm": 0.20782122905027933,
"acc_norm_stderr": 0.013570248325081347
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.02758281141515961,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.02758281141515961
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488544,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488544
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4348109517601043,
"acc_stderr": 0.012661233805616302,
"acc_norm": 0.4348109517601043,
"acc_norm_stderr": 0.012661233805616302
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.019780465954777508,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.019780465954777508
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017204,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017204
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38555691554467564,
"mc1_stderr": 0.01703883901059167,
"mc2": 0.5514098320774886,
"mc2_stderr": 0.0160327733300155
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183525
},
"harness|drop|3": {
"em": 0.42606963087248323,
"em_stderr": 0.0050641847856105855,
"f1": 0.5302139261744996,
"f1_stderr": 0.004659796001509701
},
"harness|gsm8k|5": {
"acc": 0.155420773313116,
"acc_stderr": 0.009979689409499152
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.6822293400764465,
-0.8855184316635132,
0.24559719860553741,
0.17046283185482025,
-0.1974889189004898,
-0.05417296662926674,
0.03532731160521507,
-0.2711780071258545,
0.5616278648376465,
-0.03354009613394737,
-0.4708154797554016,
-0.7059969305992126,
-0.42084020376205444,
0.2080485522747... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
DiegoMVM/DefinicionesDerechoPeruano | DiegoMVM | 2023-11-25T03:49:11Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T03:49:11Z | 2023-11-25T03:48:28.000Z | 2023-11-25T03:48:28 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
malaysia-ai/mosaic-combine-all | malaysia-ai | 2023-11-28T08:18:52Z | 0 | 0 | null | [
"language:ms",
"region:us"
] | 2023-11-28T08:18:52Z | 2023-11-25T03:53:14.000Z | 2023-11-25T03:53:14 | ---
language:
- ms
---
# Mosaic format for the combine-all dataset to train a Malaysian LLM
This repository stores dataset shards in Mosaic format.
1. Prepared at https://github.com/malaysia-ai/dedup-text-dataset/blob/main/pretrain-llm/combine-all.ipynb
2. Tokenized with https://huggingface.co/malaysia-ai/bpe-tokenizer
3. 4096 context length.
## how-to
1. Clone the repository:
```bash
git lfs clone https://huggingface.co/datasets/malaysia-ai/mosaic-combine-all
```
2. Load it:
```python
from streaming import LocalDataset
import numpy as np
from streaming.base.format.mds.encodings import Encoding, _encodings
class UInt16(Encoding):
    """Custom MDS encoding: token ids stored as raw little-endian uint16 bytes."""

    def encode(self, obj) -> bytes:
        return obj.tobytes()

    def decode(self, data: bytes) -> np.ndarray:
        return np.frombuffer(data, np.uint16)
_encodings['uint16'] = UInt16
dataset = LocalDataset('mosaic-combine-all')
len(dataset)  # number of samples in the local shard directory
``` | [
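
The `UInt16` encoding above is just a lossless raw-bytes round trip. As a minimal stdlib sketch of the same idea (using `struct` instead of numpy, purely for illustration):

```python
import struct

def encode_uint16(values):
    # Pack token ids as little-endian unsigned 16-bit integers,
    # mirroring what tobytes() produces for a numpy uint16 array.
    return struct.pack(f"<{len(values)}H", *values)

def decode_uint16(data):
    # Inverse of encode_uint16: recover the token ids from raw bytes.
    return list(struct.unpack(f"<{len(data) // 2}H", data))

tokens = [101, 2023, 4096, 65535]
assert decode_uint16(encode_uint16(tokens)) == tokens
```

A fixed-width encoding like this is sufficient as long as the tokenizer vocabulary fits in 65536 ids.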
-0.47123393416404724,
-0.23117363452911377,
0.1320270597934723,
0.580291211605072,
-0.7604060173034668,
0.026354866102337837,
-0.22735178470611572,
0.02346676215529442,
0.7793804407119751,
0.5408183336257935,
-0.5994640588760376,
-0.5018149018287659,
-0.644705593585968,
0.25355225801467896... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
BangumiBase/madeinabyss | BangumiBase | 2023-11-25T08:04:16Z | 0 | 0 | null | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | 2023-11-25T08:04:16Z | 2023-11-25T05:19:02.000Z | 2023-11-25T05:19:02 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Made In Abyss
This is the image base of the bangumi Made in Abyss. We detected 35 characters and 3476 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may still contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 95 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 81 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 46 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 25 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 116 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 374 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 38 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 37 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 1042 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 77 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 17 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 34 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 8 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 64 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 15 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 5 | [Download](15/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 16 | 26 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 731 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 40 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 133 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 23 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 6 | [Download](21/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 22 | 64 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 28 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 21 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 16 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 14 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 9 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 15 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 9 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 12 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 19 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 45 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 5 | [Download](33/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| noise | 186 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
| [
-0.7061129212379456,
-0.16650788486003876,
0.16299982368946075,
0.2264678031206131,
-0.3013360798358917,
-0.07863856106996536,
0.03180072084069252,
-0.42233940958976746,
0.681890606880188,
0.5297341346740723,
-0.9174515008926392,
-0.8557584285736084,
-0.6689456701278687,
0.5244402885437012... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
yangjinlong/gz | yangjinlong | 2023-11-25T23:41:02Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-25T23:41:02Z | 2023-11-25T05:20:26.000Z | 2023-11-25T05:20:26 | ---
license: mit
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
vilm/small_ocr | vilm | 2023-11-25T05:51:18Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T05:51:18Z | 2023-11-25T05:29:35.000Z | 2023-11-25T05:29:35 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
hyonee/hyoneeset | hyonee | 2023-11-25T05:39:13Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T05:39:13Z | 2023-11-25T05:39:13.000Z | 2023-11-25T05:39:13 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
arpitdvd/sample_font_aesthetics_ds | arpitdvd | 2023-11-25T08:08:10Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-25T08:08:10Z | 2023-11-25T06:14:52.000Z | 2023-11-25T06:14:52 | ---
license: mit
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AppleHarem/shimakaze_azurlane | AppleHarem | 2023-11-25T06:34:54Z | 0 | 0 | null | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2023-11-25T06:34:54Z | 2023-11-25T06:34:37.000Z | 2023-11-25T06:34:37 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shimakaze (Azur Lane)
This is the dataset of shimakaze (Azur Lane), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs), [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 555 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 602 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 555 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 555 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 474 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 602 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 602 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
| [
-0.6974093914031982,
-0.4235075116157532,
0.5986726880073547,
0.12137041985988617,
-0.08031627535820007,
-0.18456973135471344,
0.13127848505973816,
-0.45799341797828674,
0.5794039368629456,
0.7469390630722046,
-1.0602272748947144,
-0.9902631640434265,
-0.6158269047737122,
0.274251669645309... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Shayan01/islamic-data | Shayan01 | 2023-11-25T07:11:34Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-25T07:11:34Z | 2023-11-25T06:42:25.000Z | 2023-11-25T06:42:25 | ---
license: mit
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Acasac2145/An-Azure-Resource | Acasac2145 | 2023-11-25T06:51:12Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T06:51:12Z | 2023-11-25T06:50:26.000Z | 2023-11-25T06:50:26 | Future Developments and Enhancements
Azure is a dynamic ecosystem, constantly evolving to meet the ever-changing needs of its users (see <a href="https://dumpsmedia.com/education/an-azure-resource-can-have-multiple-delete-locks/">An Azure Resource can have Multiple Delete Locks</a>). This section provides a glimpse into the future, discussing potential developments and enhancements related to Resource Locks, ensuring users stay ahead in their quest for a secure Azure environment.
The Evolving Landscape of Azure Security
In conclusion, the article reflects on the evolving nature of Azure security. By adopting the principles discussed within, users can navigate the intricate landscape of cloud security with confidence, knowing they are equipped with the knowledge to safeguard their resources effectively.
An Azure Resource can have Multiple Delete Locks
Azure, Microsoft's groundbreaking cloud computing platform, is empowering businesses and developers worldwide with its vast array of services. Among the many features ensuring the security and integrity of resources, one notable capability stands out – the ability of an Azure Resource to have multiple Delete Locks.
Understanding Azure Resource Locks
Azure Resource Locks are akin to a digital fortress, fortifying your critical assets against accidental or unauthorized alterations. These locks come in different flavors, each tailored to meet specific security needs (see <a href="https://dumpsmedia.com/education/an-azure-resource-can-have-multiple-delete-locks/">An Azure Resource can have Multiple Delete Locks</a>). One such lock, the Delete Lock, acts as a stalwart guardian against unintended deletions.
More details: https://dumpsmedia.com/education/an-azure-resource-can-have-multiple-delete-locks/
| [
-0.6410713791847229,
-0.5953699946403503,
0.16635532677173615,
-0.08270999789237976,
-0.002296151826158166,
0.7354683876037598,
0.16129310429096222,
-0.7006428241729736,
0.08808707445859909,
0.5318050384521484,
-0.8676961064338684,
-0.19652408361434937,
-0.5065197944641113,
0.1493515074253... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
glogani/clothing | glogani | 2023-11-25T07:42:50Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T07:42:50Z | 2023-11-25T07:42:50.000Z | 2023-11-25T07:42:50 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
akash140500/mini-platypus | akash140500 | 2023-11-25T07:49:12Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T07:49:12Z | 2023-11-25T07:49:11.000Z | 2023-11-25T07:49:11 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245924
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AppleHarem/yoshimi_bluearchive | AppleHarem | 2023-11-25T07:54:02Z | 0 | 0 | null | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2023-11-25T07:54:02Z | 2023-11-25T07:53:46.000Z | 2023-11-25T07:53:46 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yoshimi (Blue Archive)
This is the dataset of yoshimi (Blue Archive), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs), [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 564 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 660 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 564 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 564 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 529 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 660 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 660 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
| [
-0.7109822630882263,
-0.27412691712379456,
0.431176096200943,
0.10297146439552307,
-0.24643127620220184,
-0.12808504700660706,
0.2059015929698944,
-0.5372692346572876,
0.7672677636146545,
0.630973219871521,
-0.908695638179779,
-0.6875374913215637,
-0.5310975909233093,
0.3457244336605072,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AppleHarem/tsurugi_bluearchive | AppleHarem | 2023-11-25T08:08:32Z | 0 | 0 | null | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2023-11-25T08:08:32Z | 2023-11-25T08:08:12.000Z | 2023-11-25T08:08:12 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tsurugi (Blue Archive)
This is the dataset of tsurugi (Blue Archive), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs), [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 531 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 667 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 531 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 531 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 485 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 667 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 667 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
| [
-0.6867592334747314,
-0.26952531933784485,
0.3735906779766083,
0.17559677362442017,
-0.30072981119155884,
-0.06354427337646484,
0.0653187707066536,
-0.5054796934127808,
0.6986523866653442,
0.6034793257713318,
-0.8502821922302246,
-0.7664601802825928,
-0.5710429549217224,
0.3903740048408508... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
EricSe/Mysql | EricSe | 2023-11-25T08:48:54Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T08:48:54Z | 2023-11-25T08:47:39.000Z | 2023-11-25T08:47:39 | CREATE DATABASE halo; | [
-0.8765856623649597,
-0.5717957615852356,
0.16451536118984222,
0.8381860852241516,
-0.10301461815834045,
0.04515711963176727,
0.25969091057777405,
-0.12749452888965607,
0.761744499206543,
1.137235164642334,
-0.7669155597686768,
-0.5252682566642761,
0.20721490681171417,
-0.07596991211175919... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
SergioSCA/Visionv3 | SergioSCA | 2023-11-26T17:17:04Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-26T17:17:04Z | 2023-11-25T08:59:28.000Z | 2023-11-25T08:59:28 | ---
license: apache-2.0
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | [
-0.5322356224060059,
-0.5534716844558716,
0.1290130317211151,
0.23470577597618103,
-0.39626216888427734,
-0.11762470006942749,
-0.03545305132865906,
-0.6389272212982178,
0.5699822306632996,
0.7838326692581177,
-0.7834625840187073,
-0.9173274040222168,
-0.55633145570755,
0.13078093528747559... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Rithvik05/trial | Rithvik05 | 2023-11-25T09:09:29Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T09:09:29Z | 2023-11-25T09:09:29.000Z | 2023-11-25T09:09:29 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null |