datasetId | card |
|---|---|
BangumiBase/konoototomare | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Kono Oto Tomare!
This is the image base of the bangumi Kono Oto Tomare!; we detected 34 characters and 3706 images in total. The full dataset is available [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may still contain noisy samples.** If you intend to train models on this dataset, we recommend preprocessing the downloaded data to remove potential noisy samples (roughly 1% of the images).
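If you would rather script the download than use the per-character links below, a minimal sketch with `huggingface_hub` (an assumption on our side; it simply fetches the `all.zip` referenced above and unpacks it) could look like this:
```python
# Minimal sketch: fetch the full archive referenced above and unpack it locally.
# Assumes the layout shown in the table below: one folder per detected character
# (0/, 1/, ...) plus -1/ for the noise cluster.
from huggingface_hub import hf_hub_download
import zipfile

archive = hf_hub_download(
    repo_id="BangumiBase/konoototomare",
    filename="all.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(archive) as zf:
    zf.extractall("konoototomare")
# A quick manual pass over the extracted folders is still advisable, since
# roughly 1% of the images may be noisy.
```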
Here is an overview of the detected characters:
| # | Images | Download |
|:------|---------:|:---------------------------|
| 0 | 489 | [Download](0/dataset.zip) |
| 1 | 61 | [Download](1/dataset.zip) |
| 2 | 270 | [Download](2/dataset.zip) |
| 3 | 572 | [Download](3/dataset.zip) |
| 4 | 146 | [Download](4/dataset.zip) |
| 5 | 134 | [Download](5/dataset.zip) |
| 6 | 154 | [Download](6/dataset.zip) |
| 7 | 45 | [Download](7/dataset.zip) |
| 8 | 56 | [Download](8/dataset.zip) |
| 9 | 42 | [Download](9/dataset.zip) |
| 10 | 26 | [Download](10/dataset.zip) |
| 11 | 166 | [Download](11/dataset.zip) |
| 12 | 75 | [Download](12/dataset.zip) |
| 13 | 43 | [Download](13/dataset.zip) |
| 14 | 12 | [Download](14/dataset.zip) |
| 15 | 18 | [Download](15/dataset.zip) |
| 16 | 50 | [Download](16/dataset.zip) |
| 17 | 78 | [Download](17/dataset.zip) |
| 18 | 188 | [Download](18/dataset.zip) |
| 19 | 24 | [Download](19/dataset.zip) |
| 20 | 13 | [Download](20/dataset.zip) |
| 21 | 162 | [Download](21/dataset.zip) |
| 22 | 29 | [Download](22/dataset.zip) |
| 23 | 565 | [Download](23/dataset.zip) |
| 24 | 33 | [Download](24/dataset.zip) |
| 25 | 11 | [Download](25/dataset.zip) |
| 26 | 10 | [Download](26/dataset.zip) |
| 27 | 9 | [Download](27/dataset.zip) |
| 28 | 14 | [Download](28/dataset.zip) |
| 29 | 17 | [Download](29/dataset.zip) |
| 30 | 9 | [Download](30/dataset.zip) |
| 31 | 40 | [Download](31/dataset.zip) |
| 32 | 9 | [Download](32/dataset.zip) |
| noise | 136 | [Download](-1/dataset.zip) |
|
roleplay4fun/20240327_pippa_segmented_long_experiment_01 | ---
dataset_info:
features:
- name: segments
list:
- name: label
dtype: bool
- name: text
dtype: string
splits:
- name: train
num_bytes: 76219243.21145765
num_examples: 3985
download_size: 59627128
dataset_size: 76219243.21145765
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibranze/araproje_mmlu_tr_conf_mgpt_nearestscore_true_y | ---
dataset_info:
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: validation
num_bytes: 137404.0
num_examples: 250
download_size: 83747
dataset_size: 137404.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_mmlu_tr_conf_mgpt_nearestscore_true_y"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
y2312566/dataset | ---
license: openrail
task_categories:
- text-classification
language:
- en
tags:
- not-for-all-audiences
pretty_name: good
size_categories:
- 100K<n<1M
--- |
disi-unibo-nlp/COMMA | ---
dataset_info:
- config_name: en
features:
- name: id
dtype: string
- name: ruling_type
dtype: int64
- name: epigraph
dtype: string
- name: body
dtype: string
- name: decision
dtype: string
- name: maxims_text
dtype: string
- name: maxims_title
dtype: string
- name: full_text
dtype: string
- name: num_maxims
dtype: int64
- name: maxims_len
dtype: int64
- name: full_text_len
dtype: int64
- name: judgment_type
dtype: int64
- name: constitutional_parameters
dtype: string
- name: maxims
dtype: string
splits:
- name: train
num_bytes: 555145830
num_examples: 12600
- name: test
num_bytes: 30737608
num_examples: 700
- name: validation
num_bytes: 31671019
num_examples: 700
download_size: 278441383
dataset_size: 617554457
- config_name: es
features:
- name: id
dtype: string
- name: ruling_type
dtype: int64
- name: epigraph
dtype: string
- name: body
dtype: string
- name: decision
dtype: string
- name: maxims_text
dtype: string
- name: maxims_title
dtype: string
- name: full_text
dtype: string
- name: num_maxims
dtype: int64
- name: maxims_len
dtype: int64
- name: full_text_len
dtype: int64
- name: judgment_type
dtype: int64
- name: constitutional_parameters
dtype: string
- name: maxims
dtype: string
splits:
- name: train
num_bytes: 575679719
num_examples: 12600
- name: test
num_bytes: 31896832
num_examples: 700
- name: validation
num_bytes: 32827830
num_examples: 700
download_size: 300803577
dataset_size: 640404381
- config_name: fr
features:
- name: id
dtype: string
- name: ruling_type
dtype: int64
- name: epigraph
dtype: string
- name: body
dtype: string
- name: decision
dtype: string
- name: maxims_text
dtype: string
- name: maxims_title
dtype: string
- name: full_text
dtype: string
- name: num_maxims
dtype: int64
- name: maxims_len
dtype: int64
- name: full_text_len
dtype: int64
- name: judgment_type
dtype: int64
- name: constitutional_parameters
dtype: string
- name: maxims
dtype: string
splits:
- name: train
num_bytes: 580985816
num_examples: 12600
- name: test
num_bytes: 32177379
num_examples: 700
- name: validation
num_bytes: 33152939
num_examples: 700
download_size: 306338176
dataset_size: 646316134
- config_name: it
features:
- name: id
dtype: string
- name: ruling_type
dtype: int64
- name: epigraph
dtype: string
- name: body
dtype: string
- name: decision
dtype: string
- name: maxims_text
dtype: string
- name: maxims_title
dtype: string
- name: full_text
dtype: string
- name: num_maxims
dtype: int64
- name: maxims_len
dtype: int64
- name: full_text_len
dtype: int64
- name: judgment_type
dtype: int64
- name: constitutional_parameters
dtype: string
- name: maxims
dtype: string
splits:
- name: train
num_bytes: 557553146
num_examples: 12600
- name: test
num_bytes: 30850184
num_examples: 700
- name: validation
num_bytes: 31775341
num_examples: 700
download_size: 293523614
dataset_size: 620178671
configs:
- config_name: en
data_files:
- split: train
path: en/train-*
- split: test
path: en/test-*
- split: validation
path: en/validation-*
- config_name: es
data_files:
- split: train
path: es/train-*
- split: test
path: es/test-*
- split: validation
path: es/validation-*
- config_name: fr
data_files:
- split: train
path: fr/train-*
- split: test
path: fr/test-*
- split: validation
path: fr/validation-*
- config_name: it
data_files:
- split: train
path: it/train-*
- split: test
path: it/test-*
- split: validation
path: it/validation-*
---
### Dataset Summary
COMMA is a constitutional multi-task and multilingual archive of 14K CCIR rulings with expert-authored annotations. Its distinctive features make it a valuable resource for broader NLP research.
### Languages
Italian, English, Spanish, French
## Dataset
### Data Fields
The dataset is a list of instances (rulings); each instance contains the following fields:
| Field | Description |
|-------------------------: | ------------------------------------------------: |
| id | `(str)` The ruling ID |
| ruling_type | `(int)` The ruling type |
| epigraph | `(str)` The ruling epigraph |
| body | `(str)` The ruling body text |
| decision | `(str)` The ruling decision |
| maxims_text | `(str)` The text of ruling maxims |
| maxims_title | `(str)` The title of ruling maxims |
| full_text | `(str)` The full text of the ruling (epigraph + body + decision) |
| num_maxims | `(int)` The number of maxims |
| maxims_len | `(int)` The length of maxims |
| full_text_len | `(int)` The length of the full text |
| judgment_type | `(int)` The judgment type |
| constitutional_parameters | `(List[List[str]])` The constitutional parameters |
| maxims | `(dict)` The maxims' numbers, texts, and titles |
See the example below for how to load the data:
```python
from datasets import load_dataset

# Download the English config and load it as a DatasetDict.
comma_en = load_dataset("disi-unibo-nlp/COMMA", "en")

example = comma_en["validation"][0]  # the first instance of the dev set
print(example["full_text"])          # the full text (i.e., epigraph + body + decision) of the ruling
print(example["maxims_title"])       # the corresponding maxims title of the ruling
```
### Data Splits
| Config | Train (90%) | Test (5%) | Dev (5%) |
|:------:|------------:|----------:|---------:|
| IT | 12,600 | 700 | 700 |
| EN | 12,600 | 700 | 700 |
| ES | 12,600 | 700 | 700 |
| FR | 12,600 | 700 | 700 |
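If you need more than one language at a time, one simple option (a sketch only, using the config names and split sizes listed above) is to load each config in a loop:
```python
from datasets import load_dataset

# The four language configs declared above.
configs = ["en", "es", "fr", "it"]
comma = {cfg: load_dataset("disi-unibo-nlp/COMMA", cfg) for cfg in configs}

# Each config exposes the same train/test/validation splits (12,600 / 700 / 700).
for cfg, ds in comma.items():
    print(cfg, {split: ds[split].num_rows for split in ds})
```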
|
SoAp9035/Turkish_TinyStories | ---
license: cdla-sharing-1.0
language:
- tr
---
# Turkish TinyStories Large
### License: CDLA-Sharing-1.0
This is a Turkish translation of the stories from the [roneneldan/TinyStories](https://huggingface.co/datasets/roneneldan/TinyStories) dataset. |
rhaymison/mental-health-pt | ---
dataset_info:
features:
- name: input
dtype: string
splits:
- name: train
num_bytes: 13216762
num_examples: 11717
download_size: 3469263
dataset_size: 13216762
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- text-generation
language:
- pt
tags:
- medical
--- |
madrylab/imagenet-star-tokens | ---
license: mit
---
This dataset contains the tokens for ImageNet* from the paper [Dataset Interfaces: Diagnosing Model Failures Using Controllable Counterfactual Generation](https://arxiv.org/abs/2302.07865).
Download the tokens from the files page, or run:
```bash
wget https://huggingface.co/datasets/madrylab/imagenet-star-tokens/resolve/main/tokens.zip
``` |
Gabriel1322/lucaslira | ---
license: openrail
---
|
marcus2000/timelist_task_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: Original
dtype: string
- name: Task
dtype: string
splits:
- name: train
num_bytes: 91073.55102040817
num_examples: 41
- name: test
num_bytes: 17770.448979591838
num_examples: 8
download_size: 62081
dataset_size: 108844.0
---
# Dataset Card for "timelist_task_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huseinbud/test | ---
license: apache-2.0
---
|
dawidkubicki/ner_crypto_news | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 62388
num_examples: 152
- name: validation
num_bytes: 13265
num_examples: 32
- name: test
num_bytes: 14322
num_examples: 34
download_size: 36662
dataset_size: 89975
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
growth-cadet/eval04b_raw | ---
dataset_info:
features:
- name: ats
dtype: string
- name: context
dtype: string
- name: sys5_obj
struct:
- name: focus_areas
list:
- name: description
dtype: string
- name: subject
dtype: string
- name: industries
list:
- name: description
dtype: string
- name: subject
dtype: string
- name: products_and_technologies
list:
- name: description
dtype: string
- name: subject
dtype: string
- name: eval_crit
struct:
- name: focus_areas
dtype: float64
- name: industries
dtype: float64
- name: products_and_technologies
dtype: float64
- name: eval_values
struct:
- name: focus_areas
sequence: int64
- name: industries
sequence: int64
- name: products_and_technologies
sequence: int64
- name: uuid
dtype: string
- name: prompt
dtype: string
- name: raw_output
dtype: string
splits:
- name: train
num_bytes: 30836261
num_examples: 2229
download_size: 14405867
dataset_size: 30836261
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bleedchocolate/eng-hutv2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 42170
num_examples: 406
download_size: 10953
dataset_size: 42170
---
# Dataset Card for "eng-hutv2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_spmurrayzzz__Mistral-Syndicate-7B | ---
pretty_name: Evaluation run of spmurrayzzz/Mistral-Syndicate-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [spmurrayzzz/Mistral-Syndicate-7B](https://huggingface.co/spmurrayzzz/Mistral-Syndicate-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_spmurrayzzz__Mistral-Syndicate-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T05:59:03.827358](https://huggingface.co/datasets/open-llm-leaderboard/details_spmurrayzzz__Mistral-Syndicate-7B/blob/main/results_2023-12-30T05-59-03.827358.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.605141246638436,\n\
\ \"acc_stderr\": 0.03295805344662521,\n \"acc_norm\": 0.6090522236898664,\n\
\ \"acc_norm_stderr\": 0.03362572955811539,\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.43728309890245215,\n\
\ \"mc2_stderr\": 0.014415164176795973\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5631399317406144,\n \"acc_stderr\": 0.01449442158425652,\n\
\ \"acc_norm\": 0.6083617747440273,\n \"acc_norm_stderr\": 0.014264122124938215\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6285600477992431,\n\
\ \"acc_stderr\": 0.004822022254886021,\n \"acc_norm\": 0.8288189603664609,\n\
\ \"acc_norm_stderr\": 0.0037589728166275895\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.02854479331905533,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.02854479331905533\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36243386243386244,\n \"acc_stderr\": 0.024757473902752056,\n \"\
acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.024757473902752056\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.026069362295335137,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.026069362295335137\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124488,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124488\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.024784316942156395,\n\
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.024784316942156395\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.01714985851425095,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.01714985851425095\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n\
\ \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n\
\ \"acc_stderr\": 0.030216831011508773,\n \"acc_norm\": 0.7174887892376681,\n\
\ \"acc_norm_stderr\": 0.030216831011508773\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n\
\ \"acc_stderr\": 0.014179171373424384,\n \"acc_norm\": 0.8045977011494253,\n\
\ \"acc_norm_stderr\": 0.014179171373424384\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688225,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688225\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22569832402234638,\n\
\ \"acc_stderr\": 0.013981395058455057,\n \"acc_norm\": 0.22569832402234638,\n\
\ \"acc_norm_stderr\": 0.013981395058455057\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537368,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537368\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291474,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291474\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44654498044328556,\n\
\ \"acc_stderr\": 0.012697046024399684,\n \"acc_norm\": 0.44654498044328556,\n\
\ \"acc_norm_stderr\": 0.012697046024399684\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681397,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681397\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6339869281045751,\n \"acc_stderr\": 0.019488025745529675,\n \
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.019488025745529675\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.03093285879278985,\n\
\ \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.03093285879278985\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.43728309890245215,\n\
\ \"mc2_stderr\": 0.014415164176795973\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7845303867403315,\n \"acc_stderr\": 0.011555295286059282\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4404852160727824,\n \
\ \"acc_stderr\": 0.013674572131693888\n }\n}\n```"
repo_url: https://huggingface.co/spmurrayzzz/Mistral-Syndicate-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|arc:challenge|25_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|arc:challenge|25_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|gsm8k|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|gsm8k|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hellaswag|10_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hellaswag|10_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T05-51-29.447448.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T05-59-03.827358.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T05-59-03.827358.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- '**/details_harness|winogrande|5_2023-12-30T05-51-29.447448.parquet'
- split: 2023_12_30T05_59_03.827358
path:
- '**/details_harness|winogrande|5_2023-12-30T05-59-03.827358.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T05-59-03.827358.parquet'
- config_name: results
data_files:
- split: 2023_12_30T05_51_29.447448
path:
- results_2023-12-30T05-51-29.447448.parquet
- split: 2023_12_30T05_59_03.827358
path:
- results_2023-12-30T05-59-03.827358.parquet
- split: latest
path:
- results_2023-12-30T05-59-03.827358.parquet
---
# Dataset Card for Evaluation run of spmurrayzzz/Mistral-Syndicate-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [spmurrayzzz/Mistral-Syndicate-7B](https://huggingface.co/spmurrayzzz/Mistral-Syndicate-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_spmurrayzzz__Mistral-Syndicate-7B",
"harness_winogrande_5",
split="train")
```
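The aggregated metrics live in the separate "results" configuration; as a minimal sketch (the config and split names are taken from the configuration list in this card), they can be loaded the same way:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split points to the newest run
results = load_dataset(
    "open-llm-leaderboard/details_spmurrayzzz__Mistral-Syndicate-7B",
    "results",
    split="latest",
)
print(results[0])
```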
## Latest results
These are the [latest results from run 2023-12-30T05:59:03.827358](https://huggingface.co/datasets/open-llm-leaderboard/details_spmurrayzzz__Mistral-Syndicate-7B/blob/main/results_2023-12-30T05-59-03.827358.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.605141246638436,
"acc_stderr": 0.03295805344662521,
"acc_norm": 0.6090522236898664,
"acc_norm_stderr": 0.03362572955811539,
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.43728309890245215,
"mc2_stderr": 0.014415164176795973
},
"harness|arc:challenge|25": {
"acc": 0.5631399317406144,
"acc_stderr": 0.01449442158425652,
"acc_norm": 0.6083617747440273,
"acc_norm_stderr": 0.014264122124938215
},
"harness|hellaswag|10": {
"acc": 0.6285600477992431,
"acc_stderr": 0.004822022254886021,
"acc_norm": 0.8288189603664609,
"acc_norm_stderr": 0.0037589728166275895
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.02854479331905533,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.02854479331905533
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.024757473902752056,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.024757473902752056
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7,
"acc_stderr": 0.026069362295335137,
"acc_norm": 0.7,
"acc_norm_stderr": 0.026069362295335137
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124488,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.024784316942156395,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.024784316942156395
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.01714985851425095,
"acc_norm": 0.8,
"acc_norm_stderr": 0.01714985851425095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508773,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508773
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424384,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424384
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688225,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688225
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22569832402234638,
"acc_stderr": 0.013981395058455057,
"acc_norm": 0.22569832402234638,
"acc_norm_stderr": 0.013981395058455057
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537368,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537368
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291474,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44654498044328556,
"acc_stderr": 0.012697046024399684,
"acc_norm": 0.44654498044328556,
"acc_norm_stderr": 0.012697046024399684
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681397,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681397
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.019488025745529675,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.019488025745529675
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.03093285879278985,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.03093285879278985
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.43728309890245215,
"mc2_stderr": 0.014415164176795973
},
"harness|winogrande|5": {
"acc": 0.7845303867403315,
"acc_stderr": 0.011555295286059282
},
"harness|gsm8k|5": {
"acc": 0.4404852160727824,
"acc_stderr": 0.013674572131693888
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
edbeeching/prj_gia_dataset_atari_2B_atari_tutankham_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning dataset for the atari_tutankham environment, with samples from the policy atari_2B_atari_tutankham_1111.
This environment was created as part of the Generally Intelligent Agents project gia: https://github.com/huggingface/gia
|
Seongill/trivia | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 603223350
num_examples: 87622
- name: test
num_bytes: 77956872
num_examples: 11313
download_size: 403718789
dataset_size: 681180222
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_qblocks__zephyr_7b_norobots | ---
pretty_name: Evaluation run of qblocks/zephyr_7b_norobots
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [qblocks/zephyr_7b_norobots](https://huggingface.co/qblocks/zephyr_7b_norobots)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 1 configuration, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_qblocks__zephyr_7b_norobots\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T22:10:51.334218](https://huggingface.co/datasets/open-llm-leaderboard/details_qblocks__zephyr_7b_norobots/blob/main/results_2023-12-02T22-10-51.334218.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.20621683093252463,\n\
\ \"acc_stderr\": 0.011144364089781441\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.20621683093252463,\n \"acc_stderr\": 0.011144364089781441\n\
\ }\n}\n```"
repo_url: https://huggingface.co/qblocks/zephyr_7b_norobots
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_02T22_10_51.334218
path:
- '**/details_harness|gsm8k|5_2023-12-02T22-10-51.334218.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T22-10-51.334218.parquet'
- config_name: results
data_files:
- split: 2023_12_02T22_10_51.334218
path:
- results_2023-12-02T22-10-51.334218.parquet
- split: latest
path:
- results_2023-12-02T22-10-51.334218.parquet
---
# Dataset Card for Evaluation run of qblocks/zephyr_7b_norobots
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/qblocks/zephyr_7b_norobots
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [qblocks/zephyr_7b_norobots](https://huggingface.co/qblocks/zephyr_7b_norobots) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_qblocks__zephyr_7b_norobots",
"harness_gsm8k_5",
split="train")
```
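The timestamped and "latest" splits listed in the configuration above can also be requested directly; a minimal sketch:
```python
from datasets import load_dataset

# "latest" aliases the most recent evaluation run for this task
latest = load_dataset(
    "open-llm-leaderboard/details_qblocks__zephyr_7b_norobots",
    "harness_gsm8k_5",
    split="latest",
)
print(len(latest))
```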
## Latest results
These are the [latest results from run 2023-12-02T22:10:51.334218](https://huggingface.co/datasets/open-llm-leaderboard/details_qblocks__zephyr_7b_norobots/blob/main/results_2023-12-02T22-10-51.334218.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.20621683093252463,
"acc_stderr": 0.011144364089781441
},
"harness|gsm8k|5": {
"acc": 0.20621683093252463,
"acc_stderr": 0.011144364089781441
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
one-sec-cv12/chunk_118 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 26944633584.375
num_examples: 280533
download_size: 24953577185
dataset_size: 26944633584.375
---
# Dataset Card for "chunk_118"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
polinaeterna/audio_configs2 | ---
configs_kwargs:
- config_name: v1
data_dir: v1
drop_labels: true
- config_name: v2
data_dir: v2
drop_labels: false
duplicated_from: polinaeterna/audio_configs
---
|
mariosasko/test1 | ---
dataset_info:
features:
- name: a
dtype: int64
- name: b
dtype: string
- name: c
dtype: bool
splits:
- name: train
num_bytes: 49
num_examples: 3
download_size: 1536
dataset_size: 49
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
marianna13/litarch | ---
license: cc
language:
- en
---
Textbooks from [PubChem Literature Archive](https://ftp.ncbi.nlm.nih.gov/pub/litarch/).
# Image-Text Pairs
```
[
['litarch_figures/ca/84/gene_NBK1116/angelmanF1.jpg', '\nIndividuals depicted have a genetically confirmed diagnosis of Angelman syndrome. Happy expression and an unstable gait accompanied by uplifted arms are commonly observed. At times, the facial appearance can suggest the diagnosis, but usually facial features are not distinctive.\n', ''],
['litarch_figures/ca/84/gene_NBK1116/angelmanF2.jpg', '\nSchematic drawing of chromosome region 15q11.2-q13 indicating the breakpoint regions BP1-BP6. Low copy repeat elements are located within these breakpoint regions (see text for details). Approximately 90% of chromosome deletions resulting in Angelman syndrome initiate at BP1 or BP2 and terminate in region BP3 (class I and class II). Approximately 10% of deletions are larger, typically spanning from BP1 to BP5, rarely beyond BP5. Genes that are not imprinted and thus biparentally expressed are noted by the open circles. The two critical imprinting center (IC) elements, the AS-SRO and the PWS-SRO, are drawn as open boxes. The gene SNRUF-SNRPN, drawn as a shaded box, has some overlap with the PWS-SRO. The SNURF-SNRPN sense/UBE3A antisense transcript is labeled UBE3A-AS.\n', ''],
['litarch_figures/ca/84/gene_NBK1116/angelmanF3.jpg', '\nThe pedigree illustrates imprinting inheritance in Angelman syndrome (AS). Inheritance of a deleterious UBE3A pathogenic variant from the male (top left, I-1) has no effect on the two children (II-2, II-4) who inherit his pathogenic variant because the mutated UBE3A has already been inactivated in his germ cells (i.e., by imprinting) and because each of these children also inherited a normally activated UBE3A from their mother (I-2). (Note: Only one active UBE3A allele is required for normal brain functioning.) If his carrier daughter (II-2) transmits the UBE3A pathogenic variant to the grandson and granddaughter (III-1, III-2), they both will have AS since each will have also inherited an inactivated UBE3A from their father; thus, neither child will express a UBE3A allele. The same explanation pertains for AS occurring in the great grand-niece (bottom right, IV-2).\n', '']
]
```
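As a minimal, hedged sketch (the record below is copied from the sample above and the loop is illustrative only), each entry is a `[image_path, caption, extra]` triple and can be consumed like this:
```python
from pathlib import Path

# One illustrative record in the [image_path, caption, extra] layout shown above
pairs = [
    ["litarch_figures/ca/84/gene_NBK1116/angelmanF1.jpg",
     "\nIndividuals depicted have a genetically confirmed diagnosis of Angelman syndrome.\n",
     ""],
]

for image_path, caption, extra in pairs:
    # Pair the figure file name with its cleaned-up caption text
    print(Path(image_path).name, "->", caption.strip())
```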
# Interleaved:
```
[
["Getting by with the bare minimum seems to be the modus operandi of Mycobacterium leprae \u2014 the causal agent of leprosy. Its genome sequence reveals that it has undergone massive genome 'downsizing' over time, discarding more than half its genes and rendering it the most striking example of genome reduction in a microbial pathogen."],
["The leprosy bacillus is famed for being the first microorganism definitively shown to be associated with human disease. It evades the host's immune response by invading and propagating inside the vacuoles of macrophages called phagosomes. From there, it infects the Schwann cells of the peripheral nervous system, where it disrupts myelin production, thus leading to the characteristic features of leprosy, which include skin lesions and sensory loss."],
["litarch_figures/df/45/coffeebrk_NBK2345/A559.jpg",
"\nProtein coding genes distribution map for Mycobacterium leprae.\nThe leprosy bacillus genome contains numerous examples of gene deletion and decay. The relative locations of various genes in the genome are depicted in the map above. Protein coding genes are color coded in the map according to their classification within clusters of orthologous groups (COGs) functional categories. COGs represent proteins or groups of paralogs that are found in at least 3 phylogenetically-distant genomes. For more information about COGs, see Science 1997 Oct 24:278(5338):631-7.\n\n",
""],
["Protein coding genes distribution map for Mycobacterium leprae."]
]
```
# Text
```
"Getting by with the bare minimum seems to be the modus operandi of Mycobacterium leprae \u2014 the causal agent of leprosy. Its genome sequence reveals that it has undergone massive genome 'downsizing' over time, discarding more than half its genes and rendering it the most striking example of genome reduction in a microbial pathogen.\nThe leprosy bacillus is famed for being the first microorganism definitively shown to be associated with human disease. It evades the host's immune response by invading and propagating inside the vacuoles of macrophages called phagosomes. From there, it infects the Schwann cells of the peripheral nervous system, where it disrupts myelin production, thus leading to the characteristic features of leprosy, which include skin lesions and sensory loss... "
``` |
beyonddata/mywitch | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 498141.0
num_examples: 11
download_size: 499720
dataset_size: 498141.0
---
# Dataset Card for "mywitch"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OpenDFM/MoCon | ---
license: cc-by-nc-sa-4.0
tags:
- GUI
pretty_name: MoCon
viewer: False
---
# MoGUI😈 and MoCon🛡️
<div align="center">
📃 [Paper](./MoGUI_Paper_v0.1.pdf) | 🛡️ [MoCon Data](https://huggingface.co/datasets/OpenDFM/MoCon) | 😈 [MoGUI Data](https://huggingface.co/datasets/OpenDFM/MoGUI)
[简体中文](./README_zh.md) | English
</div>
## 🔥 News
- **[Coming Soon]** We will release the complete technical report soon.
- **[2024.3.1]** We have released [MoCon🛡️ data](https://huggingface.co/datasets/OpenDFM/MoCon).
- **[2024.2.29]** We have released [MoGUI😈 data](https://huggingface.co/datasets/OpenDFM/MoGUI) and [pre-release paper](./MoGUI_Paper_v0.1.pdf).
## 📑 Citation
If you find our work useful, please cite us!
```
@misc{zhu2024mogui,
title={Technical Report of MoGUI and MoCon},
author={Zichen Zhu and Liangtai Sun and Danyang Zhang and Ziyuan Li and Guangpeng Li and Lu Chen and Kai Yu},
year={2024},
howpublished={\url{https://huggingface.co/datasets/OpenDFM/MoGUI}}
}
@inproceedings{sun2022meta,
title={META-GUI: Towards Multi-modal Conversational Agents on Mobile GUI},
author={Sun, Liangtai and Chen, Xingyu and Chen, Lu and Dai, Tianle and Zhu, Zichen and Yu, Kai},
booktitle={Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing},
pages={6699--6712},
year={2022}
}
@inproceedings{zhu2023cam,
title={CAM-GUI: A Conversational Assistant on Mobile GUI},
author={Zhu, Zichen and Sun, Liangtai and Yang, Jingkai and Peng, Yifan and Zou, Weilin and Li, Ziyuan and Li, Wutao and Chen, Lu and Ma, Yingzi and Zhang, Danyang and others},
booktitle={National Conference on Man-Machine Speech Communication},
pages={302--315},
year={2023},
organization={Springer}
}
```
## 📧 Contact Us
If you have any questions, please feel free to contact us via email `JamesZhutheThird@sjtu.edu.cn` and `zhang-dy20@sjtu.edu.cn` |
huggingartists/lovv66 | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/lovv66"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.290425 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/73c061dff4e60a751b35fda72ecb6781.881x881x1.png')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/lovv66">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">LOVV66</div>
<a href="https://genius.com/artists/lovv66">
<div style="text-align: center; font-size: 14px;">@lovv66</div>
</a>
</div>
### Dataset Summary
The lyrics dataset was parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
The model is available [here](https://huggingface.co/huggingartists/lovv66).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lovv66")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|119| -| -|
'Train' can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/lovv66")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
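# np.split takes cumulative split points: [end of train, end of validation]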
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk},
year={2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
Freira/Hunting3 | ---
license: openrail
---
|
ineoApp/ds_fact_99 | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: bboxes
sequence:
sequence: int64
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': numero facture
'2': Telephone
'3': Email
'4': Site web
'5': RC
'6': CNSS
'7': TP
'8': Mode de paiement
'9': fournisseur
'10': date facture
'11': date limite
'12': montant ht
'13': montant ttc
'14': tva
'15': prix tva
'16': addresse
'17': reference
'18': Devise
'19': ICE fournisseur
'20': IF fournisseur
'21': Condition de paiement
'22': informations
'23': art1 designation
'24': art1 quantite
'25': art1 prix unit
'26': art1 tva
'27': art1 montant ht
'28': art1 Article
'29': art1 taux de remise
'30': art2 designation
'31': art2 quantite
'32': art2 prix unit
'33': art2 tva
'34': art2 montant ht
'35': art2 Article
'36': art2 taux de remise
'37': art3 designation
'38': art3 quantite
'39': art3 prix unit
'40': art3 tva
'41': art3 montant ht
'42': art3 Article
'43': art3 taux de remise
'44': art4 designation
'45': art4 quantite
'46': art4 prix unit
'47': art4 tva
'48': art4 montant ht
'49': art4 Article
'50': art4 taux de remise
'51': art5 designation
'52': art5 quantite
'53': art5 prix unit
'54': art5 tva
'55': art5 montant ht
'56': art5 Article
'57': art5 taux de remise
'58': art6 designation
'59': art6 quantite
'60': art6 prix unit
'61': art6 tva
'62': art6 montant ht
'63': art6 Article
'64': art6 taux de remise
'65': art7 designation
'66': art7 quantite
'67': art7 prix unit
'68': art7 tva
'69': art7 montant ht
'70': art7 Article
'71': art7 taux de remise
'72': art8 designation
'73': art8 quantite
'74': art8 prix unit
'75': art8 tva
'76': art8 montant ht
'77': art8 Article
'78': art8 taux de remise
'79': art9 designation
'80': art9 quantite
'81': art9 prix unit
'82': art9 tva
'83': art9 montant ht
'84': art9 Article
'85': art9 taux de remise
'86': art10 designation
'87': art10 quantite
'88': art10 prix unit
'89': art10 tva
'90': art10 montant ht
'91': art10 Article
'92': art10 taux de remise
'93': art11 designation
'94': art11 quantite
'95': art11 prix unit
'96': art11 tva
'97': art11 montant ht
'98': art11 Article
'99': art11 taux de remise
'100': art12 designation
'101': art12 quantite
'102': art12 prix unit
'103': art12 tva
'104': art12 montant ht
'105': art12 Article
'106': art12 taux de remise
- name: tokens
sequence: string
splits:
- name: train
num_bytes: 152365915.2
num_examples: 112
- name: test
num_bytes: 38091478.8
num_examples: 28
download_size: 176601547
dataset_size: 190457394.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Andyrasika/prompt_qa | ---
dataset_info:
features:
- name: question
dtype: string
- name: prompt
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 294966.75
num_examples: 423
- name: test
num_bytes: 98322.25
num_examples: 141
download_size: 213420
dataset_size: 393289
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
license: creativeml-openrail-m
language:
- en
tags:
- dialogue
---
# Dataset Card for "prompt_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Paul/hatecheck-german | ---
annotations_creators:
- crowdsourced
language_creators:
- expert-generated
language:
- de
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: German HateCheck
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- hate-speech-detection
---
# Dataset Card for Multilingual HateCheck
## Dataset Description
Multilingual HateCheck (MHC) is a suite of functional tests for hate speech detection models in 10 different languages: Arabic, Dutch, French, German, Hindi, Italian, Mandarin, Polish, Portuguese and Spanish.
For each language, there are 25+ functional tests that correspond to distinct types of hate and challenging non-hate.
This allows for targeted diagnostic insights into model performance.
For more details, please refer to our paper about MHC, published at the 2022 Workshop on Online Abuse and Harms (WOAH) at NAACL 2022. If you are using MHC, please cite our work!
- **Paper:** Röttger et al. (2022) - Multilingual HateCheck: Functional Tests for Multilingual Hate Speech Detection Models. https://arxiv.org/abs/2206.09917
- **Repository:** https://github.com/rewire-online/multilingual-hatecheck
- **Point of Contact:** paul@rewire.online
## Dataset Structure
The csv format mostly matches the original HateCheck data, with some adjustments for specific languages.
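As a quick orientation, here is a minimal sketch (not part of the original card) of loading the data with the `datasets` library and inspecting the columns described below; no split names are assumed, the snippet simply prints whatever the repository exposes:

```python
from datasets import load_dataset

# Minimal sketch: load the German HateCheck data and look at the columns
# documented below. No split names are assumed here; we print whatever
# splits the repository exposes.
mhc_de = load_dataset("Paul/hatecheck-german")
print(mhc_de)  # DatasetDict with the available splits and row counts

first_split = next(iter(mhc_de.values()))
print(first_split.column_names)  # e.g. mhc_case_id, functionality, test_case, label_gold, ...
print(first_split[0])            # one test case with its gold label and annotations
```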
**mhc_case_id**
The test case ID that is unique to each test case across languages (e.g., "mandarin-1305")
**functionality**
The shorthand for the functionality tested by the test case (e.g., "target_obj_nh"). The same functionalities are tested in all languages, except for Mandarin and Arabic, where non-Latin script required adapting the tests for spelling variations.
**test_case**
The test case text.
**label_gold**
The gold standard label ("hateful" or "non-hateful") of the test case. All test cases within a given functionality have the same gold standard label.
**target_ident**
Where applicable, the protected group that is targeted or referenced in the test case. All HateChecks cover seven target groups, but their composition varies across languages.
**ref_case_id**
For hateful cases, where applicable, the ID of the hateful case which was perturbed to generate this test case. For non-hateful cases, where applicable, the ID of the hateful case which is contrasted by this test case.
**ref_templ_id**
The equivalent to ref_case_id, but for template IDs.
**templ_id**
The ID of the template from which the test case was generated.
**case_templ**
The template from which the test case was generated (where applicable).
**gender_male** and **gender_female**
For gender-inflected languages (French, Spanish, Portuguese, Hindi, Arabic, Italian, Polish, German), only for cases where gender inflection is relevant, separate entries for gender_male and gender_female replace case_templ.
**label_annotated**
A list of labels given by the three annotators who reviewed the test case (e.g., "['hateful', 'hateful', 'hateful']").
**label_annotated_maj**
The majority vote of the three annotators (e.g., "hateful"). In some cases this differs from the gold label given by our language experts.
**disagreement_in_case**
True if label_annotated_maj does not match label_gold for the entry.
**disagreement_in_template**
True if the test case is generated from an IDENT template and there is at least one case with disagreement_in_case generated from the same template. This can be used to exclude entire templates from MHC. |
pintolandia/zoro | ---
license: openrail
---
|
seansullivan/Cone-documents | ---
license: unknown
---
|
hunggggg/aligment-handbook-format-intel-orca-dpo-pairs | ---
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 42576577.4042305
num_examples: 11573
- name: test
num_bytes: 4731139.5957695
num_examples: 1286
download_size: 24315227
dataset_size: 47307717.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "aligment-handbook-format-intel-orca-dpo-pairs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nicky0007/cointelegraph_noticias_Es | ---
task_categories:
- token-classification
- question-answering
language:
- es
size_categories:
- 10K<n<100K
---
# Cointelegraph Spanish Dataset
## Dataset Description
This dataset collects information from Cointelegraph articles, such as the title, description, author, etc.
It has approximately 10,738 rows.
Page: https://cointelegraph.com/
Categories: #cryptocurrency, #Bitcoin, #Ethereum ... |
shuyuej/MetaMath_Paraphrase_Answeraug | ---
license: apache-2.0
---
|
bpalacios/News | ---
license: mit
---
|
bigscience-data/roots_fr_wikiversity | ---
language: fr
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_fr_wikiversity
# wikiversity_filtered
- Dataset uid: `wikiversity_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.0367 % of total
- 0.1050 % of en
- 0.1178 % of fr
- 0.1231 % of pt
- 0.0072 % of zh
- 0.0393 % of es
- 0.0076 % of ar
- 0.0069 % of indic-hi
### BigScience processing steps
#### Filters applied to: en
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_en
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_fr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_pt
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: zh
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_zhs
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_es
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: ar
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ar
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-hi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
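The names above are per-language preprocessing steps applied in sequence. As a rough, hedged sketch (the function bodies below are illustrative assumptions, not the actual BigScience implementations), a few of the simpler steps could compose like this:

```python
# Illustrative sketch only: placeholder implementations of a few of the simpler
# steps listed above, chained in their listed order. These are assumptions about
# the steps' semantics, NOT the real BigScience preprocessing code.
from typing import Callable, Optional

def filter_remove_empty_docs(doc: str) -> Optional[str]:
    return doc if doc.strip() else None          # drop empty documents

def replace_newline_with_space(doc: str) -> Optional[str]:
    return doc.replace("\n", " ")                # flatten line breaks

def filter_small_docs_bytes_1024(doc: str) -> Optional[str]:
    return doc if len(doc.encode("utf-8")) >= 1024 else None  # drop tiny docs

PIPELINE: list[Callable[[str], Optional[str]]] = [
    filter_remove_empty_docs,
    replace_newline_with_space,
    filter_small_docs_bytes_1024,
]

def apply_pipeline(doc: str) -> Optional[str]:
    for step in PIPELINE:
        doc = step(doc)
        if doc is None:   # a filter dropped the document
            return None
    return doc
```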
|
MrezaPRZ/GPT-fine-tuning-spider | ---
license: apache-2.0
---
|
LFBMS/class_dataset_real2_donut_train_val | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': bilanz_h
'1': bilanz_v
'2': guv
'3': kontennachweis_bilanz
'4': kontennachweis_guv
'5': other
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 323252155.2837959
num_examples: 1061
- name: test
num_bytes: 17061376.716204118
num_examples: 56
download_size: 320030509
dataset_size: 340313532.0
---
# Dataset Card for "class_dataset_real2_donut_train_val"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_BFauber__opt1.3b_10e4 | ---
pretty_name: Evaluation run of BFauber/opt1.3b_10e4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/opt1.3b_10e4](https://huggingface.co/BFauber/opt1.3b_10e4) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt1.3b_10e4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T19:08:04.507559](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt1.3b_10e4/blob/main/results_2024-02-02T19-08-04.507559.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.271963602705084,\n\
\ \"acc_stderr\": 0.031002150754247282,\n \"acc_norm\": 0.27398469630191746,\n\
\ \"acc_norm_stderr\": 0.031827939590220955,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.01481619599193158,\n \"mc2\": 0.3867355581952227,\n\
\ \"mc2_stderr\": 0.014105420630947732\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2687713310580205,\n \"acc_stderr\": 0.012955065963710691,\n\
\ \"acc_norm\": 0.3054607508532423,\n \"acc_norm_stderr\": 0.013460080478002498\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4124676359290978,\n\
\ \"acc_stderr\": 0.004912723848944797,\n \"acc_norm\": 0.5351523600876319,\n\
\ \"acc_norm_stderr\": 0.004977434505403351\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.03761070869867479,\n\
\ \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.03761070869867479\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493857,\n\
\ \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493857\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n\
\ \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n\
\ \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383889,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383889\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.026556982117838728,\n\
\ \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.026556982117838728\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727771,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727771\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633328,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633328\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.29354838709677417,\n\
\ \"acc_stderr\": 0.025906087021319288,\n \"acc_norm\": 0.29354838709677417,\n\
\ \"acc_norm_stderr\": 0.025906087021319288\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.0331750593000918,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.0331750593000918\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\
acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3626943005181347,\n \"acc_stderr\": 0.034697137917043715,\n\
\ \"acc_norm\": 0.3626943005181347,\n \"acc_norm_stderr\": 0.034697137917043715\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.358974358974359,\n \"acc_stderr\": 0.024321738484602357,\n \
\ \"acc_norm\": 0.358974358974359,\n \"acc_norm_stderr\": 0.024321738484602357\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073828,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073828\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3697478991596639,\n \"acc_stderr\": 0.031357095996135904,\n\
\ \"acc_norm\": 0.3697478991596639,\n \"acc_norm_stderr\": 0.031357095996135904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328,\n \"\
acc_norm\": 0.3486238532110092,\n \"acc_norm_stderr\": 0.020431254090714328\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n\
\ \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.24509803921568626,\n\
\ \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.1940928270042194,\n \"acc_stderr\": 0.025744902532290916,\n\
\ \"acc_norm\": 0.1940928270042194,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.11210762331838565,\n\
\ \"acc_stderr\": 0.021174894206346103,\n \"acc_norm\": 0.11210762331838565,\n\
\ \"acc_norm_stderr\": 0.021174894206346103\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.15702479338842976,\n \"acc_stderr\": 0.0332124484254713,\n \"\
acc_norm\": 0.15702479338842976,\n \"acc_norm_stderr\": 0.0332124484254713\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16964285714285715,\n\
\ \"acc_stderr\": 0.03562367850095391,\n \"acc_norm\": 0.16964285714285715,\n\
\ \"acc_norm_stderr\": 0.03562367850095391\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18803418803418803,\n\
\ \"acc_stderr\": 0.025598193686652282,\n \"acc_norm\": 0.18803418803418803,\n\
\ \"acc_norm_stderr\": 0.025598193686652282\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.210727969348659,\n\
\ \"acc_stderr\": 0.014583812465862553,\n \"acc_norm\": 0.210727969348659,\n\
\ \"acc_norm_stderr\": 0.014583812465862553\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.022989592543123567,\n\
\ \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.022989592543123567\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767864,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.026090162504279053,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.026090162504279053\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2315112540192926,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.2315112540192926,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.023468429832451145,\n\
\ \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.023468429832451145\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843007,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843007\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n\
\ \"acc_stderr\": 0.010986307870045517,\n \"acc_norm\": 0.24511082138200782,\n\
\ \"acc_norm_stderr\": 0.010986307870045517\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2222222222222222,\n \"acc_stderr\": 0.016819028375736386,\n \
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.016819028375736386\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.37551020408163266,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.37551020408163266,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2885572139303483,\n\
\ \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.2885572139303483,\n\
\ \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n\
\ \"acc_stderr\": 0.031069390260789437,\n \"acc_norm\": 0.19879518072289157,\n\
\ \"acc_norm_stderr\": 0.031069390260789437\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2046783625730994,\n \"acc_stderr\": 0.030944459778533207,\n\
\ \"acc_norm\": 0.2046783625730994,\n \"acc_norm_stderr\": 0.030944459778533207\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.01481619599193158,\n \"mc2\": 0.3867355581952227,\n\
\ \"mc2_stderr\": 0.014105420630947732\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5840568271507498,\n \"acc_stderr\": 0.013852485356798262\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/opt1.3b_10e4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|arc:challenge|25_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|gsm8k|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hellaswag|10_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-08-04.507559.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T19-08-04.507559.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- '**/details_harness|winogrande|5_2024-02-02T19-08-04.507559.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T19-08-04.507559.parquet'
- config_name: results
data_files:
- split: 2024_02_02T19_08_04.507559
path:
- results_2024-02-02T19-08-04.507559.parquet
- split: latest
path:
- results_2024-02-02T19-08-04.507559.parquet
---
# Dataset Card for Evaluation run of BFauber/opt1.3b_10e4
Dataset automatically created during the evaluation run of model [BFauber/opt1.3b_10e4](https://huggingface.co/BFauber/opt1.3b_10e4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt1.3b_10e4",
"harness_winogrande_5",
split="train")
```
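The aggregated metrics for the run live in the "results" configuration listed in the YAML header above; as a small sketch (not in the original card), they can be loaded the same way, with the "latest" split always pointing at the most recent run:

```python
from datasets import load_dataset

# Sketch: fetch the aggregated metrics for this model from the "results"
# configuration (config and split names are taken from the YAML header above).
results = load_dataset(
    "open-llm-leaderboard/details_BFauber__opt1.3b_10e4",
    "results",
    split="latest",
)
print(results[0])  # aggregated accuracy / stderr figures for the run
```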
## Latest results
These are the [latest results from run 2024-02-02T19:08:04.507559](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt1.3b_10e4/blob/main/results_2024-02-02T19-08-04.507559.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.271963602705084,
"acc_stderr": 0.031002150754247282,
"acc_norm": 0.27398469630191746,
"acc_norm_stderr": 0.031827939590220955,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.01481619599193158,
"mc2": 0.3867355581952227,
"mc2_stderr": 0.014105420630947732
},
"harness|arc:challenge|25": {
"acc": 0.2687713310580205,
"acc_stderr": 0.012955065963710691,
"acc_norm": 0.3054607508532423,
"acc_norm_stderr": 0.013460080478002498
},
"harness|hellaswag|10": {
"acc": 0.4124676359290978,
"acc_stderr": 0.004912723848944797,
"acc_norm": 0.5351523600876319,
"acc_norm_stderr": 0.004977434505403351
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.03761070869867479,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.03761070869867479
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383889,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383889
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20851063829787234,
"acc_stderr": 0.026556982117838728,
"acc_norm": 0.20851063829787234,
"acc_norm_stderr": 0.026556982117838728
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633328,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574925,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574925
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.29354838709677417,
"acc_stderr": 0.025906087021319288,
"acc_norm": 0.29354838709677417,
"acc_norm_stderr": 0.025906087021319288
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358609,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358609
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.0331750593000918,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.0331750593000918
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3626943005181347,
"acc_stderr": 0.034697137917043715,
"acc_norm": 0.3626943005181347,
"acc_norm_stderr": 0.034697137917043715
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.358974358974359,
"acc_stderr": 0.024321738484602357,
"acc_norm": 0.358974358974359,
"acc_norm_stderr": 0.024321738484602357
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073828
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3697478991596639,
"acc_stderr": 0.031357095996135904,
"acc_norm": 0.3697478991596639,
"acc_norm_stderr": 0.031357095996135904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3486238532110092,
"acc_stderr": 0.020431254090714328,
"acc_norm": 0.3486238532110092,
"acc_norm_stderr": 0.020431254090714328
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.1940928270042194,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.1940928270042194,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.11210762331838565,
"acc_stderr": 0.021174894206346103,
"acc_norm": 0.11210762331838565,
"acc_norm_stderr": 0.021174894206346103
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.15702479338842976,
"acc_stderr": 0.0332124484254713,
"acc_norm": 0.15702479338842976,
"acc_norm_stderr": 0.0332124484254713
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16964285714285715,
"acc_stderr": 0.03562367850095391,
"acc_norm": 0.16964285714285715,
"acc_norm_stderr": 0.03562367850095391
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18803418803418803,
"acc_stderr": 0.025598193686652282,
"acc_norm": 0.18803418803418803,
"acc_norm_stderr": 0.025598193686652282
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.210727969348659,
"acc_stderr": 0.014583812465862553,
"acc_norm": 0.210727969348659,
"acc_norm_stderr": 0.014583812465862553
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767864,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767864
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2315112540192926,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.2315112540192926,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.023468429832451145,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.023468429832451145
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843007,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843007
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24511082138200782,
"acc_stderr": 0.010986307870045517,
"acc_norm": 0.24511082138200782,
"acc_norm_stderr": 0.010986307870045517
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.016819028375736386,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.016819028375736386
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37551020408163266,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.37551020408163266,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2885572139303483,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.2885572139303483,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.031069390260789437,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.031069390260789437
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2046783625730994,
"acc_stderr": 0.030944459778533207,
"acc_norm": 0.2046783625730994,
"acc_norm_stderr": 0.030944459778533207
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.01481619599193158,
"mc2": 0.3867355581952227,
"mc2_stderr": 0.014105420630947732
},
"harness|winogrande|5": {
"acc": 0.5840568271507498,
"acc_stderr": 0.013852485356798262
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
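The numbers above can also be pulled programmatically with 🤗 Datasets. A minimal sketch, assuming the usual Open LLM Leaderboard layout of one config per task plus a `latest` split (the config name below is an illustrative example, not verified against this repo's file list):
```python
from datasets import load_dataset

# Per-task details live in configs such as "harness_winogrande_5" or
# "harness_arc_challenge_25"; the "latest" split points to the most recent run.
# These names follow the standard leaderboard convention and are assumptions here.
details = load_dataset(
    "open-llm-leaderboard/details_BFauber__opt1.3b_10e4",
    "harness_winogrande_5",
    split="latest",
)
print(details)
```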
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/graf_zeppelin_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of graf_zeppelin/グラーフ・ツェッペリン (Kantai Collection)
This is the dataset of graf_zeppelin/グラーフ・ツェッペリン (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `blonde_hair, long_hair, twintails, breasts, sidelocks, large_breasts, hair_between_eyes, blue_eyes, hat, peaked_cap`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 595.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/graf_zeppelin_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 350.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/graf_zeppelin_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1264 | 773.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/graf_zeppelin_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 531.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/graf_zeppelin_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1264 | 1.05 GiB | [Download](https://huggingface.co/datasets/CyberHarem/graf_zeppelin_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/graf_zeppelin_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, black_gloves, capelet, looking_at_viewer, military_uniform, miniskirt, necktie, solo, black_pantyhose, pleated_skirt, simple_background, white_background, hand_on_hip, iron_cross |
| 1 | 23 |  |  |  |  |  | 1girl, black_skirt, capelet, iron_cross, military_uniform, miniskirt, necktie, pleated_skirt, solo, black_gloves, black_pantyhose, looking_at_viewer, military_hat, simple_background, white_background, jacket, long_sleeves, cowboy_shot, purple_eyes |
| 2 | 10 |  |  |  |  |  | 1girl, black_gloves, black_pantyhose, capelet, looking_at_viewer, miniskirt, necktie, pleated_skirt, solo, iron_cross, military_uniform, black_skirt, crossed_arms |
| 3 | 11 |  |  |  |  |  | 1girl, black_pantyhose, capelet, looking_at_viewer, solo, uniform, black_gloves, miniskirt, blush |
| 4 | 25 |  |  |  |  |  | 1girl, capelet, military_uniform, necktie, solo, iron_cross, upper_body, looking_at_viewer, simple_background, white_background, black_gloves, long_sleeves, military_hat, blush |
| 5 | 10 |  |  |  |  |  | looking_at_viewer, 1girl, blush, solo, navel, simple_background, white_bikini, cleavage, collarbone, iron_cross, white_background, cowboy_shot, necktie, black_gloves, side-tie_bikini_bottom |
| 6 | 9 |  |  |  |  |  | 1girl, cleavage, navel, solo, black_bikini, collarbone, blush, looking_at_viewer, closed_mouth, grey_eyes, alternate_costume, simple_background, white_background |
| 7 | 5 |  |  |  |  |  | 1boy, 1girl, black_gloves, hetero, nipples, penis, bar_censor, huge_breasts, solo_focus, blush, handjob, paizuri, uniform, alternate_breast_size, ass, breasts_out, clothed_female_nude_male, ejaculation, gigantic_breasts, heart-shaped_pupils, large_areolae, purple_eyes, smile, tongue_out |
| 8 | 5 |  |  |  |  |  | 1girl, detached_collar, looking_at_viewer, playboy_bunny, rabbit_ears, rabbit_tail, solo, alternate_costume, black_leotard, fake_animal_ears, simple_background, black_pantyhose, grey_eyes, strapless_leotard, white_background, wrist_cuffs, armpits, ass, blush, cleavage, fake_tail, gloves, highleg_leotard |
| 9 | 8 |  |  |  |  |  | cleavage, dirndl, waist_apron, 1girl, alternate_costume, beer_mug, solo, underbust, blush, holding_cup, iron_cross, collarbone, looking_at_viewer, black_dress, cowboy_shot, puffy_short_sleeves, bangs, necklace, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | capelet | looking_at_viewer | military_uniform | miniskirt | necktie | solo | black_pantyhose | pleated_skirt | simple_background | white_background | hand_on_hip | iron_cross | black_skirt | military_hat | jacket | long_sleeves | cowboy_shot | purple_eyes | crossed_arms | uniform | blush | upper_body | navel | white_bikini | cleavage | collarbone | side-tie_bikini_bottom | black_bikini | closed_mouth | grey_eyes | alternate_costume | 1boy | hetero | nipples | penis | bar_censor | huge_breasts | solo_focus | handjob | paizuri | alternate_breast_size | ass | breasts_out | clothed_female_nude_male | ejaculation | gigantic_breasts | heart-shaped_pupils | large_areolae | smile | tongue_out | detached_collar | playboy_bunny | rabbit_ears | rabbit_tail | black_leotard | fake_animal_ears | strapless_leotard | wrist_cuffs | armpits | fake_tail | gloves | highleg_leotard | dirndl | waist_apron | beer_mug | underbust | holding_cup | black_dress | puffy_short_sleeves | bangs | necklace |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:----------|:--------------------|:-------------------|:------------|:----------|:-------|:------------------|:----------------|:--------------------|:-------------------|:--------------|:-------------|:--------------|:---------------|:---------|:---------------|:--------------|:--------------|:---------------|:----------|:--------|:-------------|:--------|:---------------|:-----------|:-------------|:-------------------------|:---------------|:---------------|:------------|:--------------------|:-------|:---------|:----------|:--------|:-------------|:---------------|:-------------|:----------|:----------|:------------------------|:------|:--------------|:---------------------------|:--------------|:-------------------|:----------------------|:----------------|:--------|:-------------|:------------------|:----------------|:--------------|:--------------|:----------------|:-------------------|:--------------------|:--------------|:----------|:------------|:---------|:------------------|:---------|:--------------|:-----------|:------------|:--------------|:--------------|:----------------------|:--------|:-----------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | X | X | | X | | X | X | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 25 |  |  |  |  |  | X | X | X | X | X | | X | X | | | X | X | | X | | X | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 10 |  |  |  |  |  | X | X | | X | | | X | X | | | X | X | | X | | | | | X | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | | X | | | | X | | | X | X | | | | | | | | | | | X | | X | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | | | | | | | | | | | | | | | | | | X | | X | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 8 | 5 |  |  |  |  |  | X | | | X | | | | X | X | | X | X | | | | | | | | | | | X | | | | X | | | | | X | X | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 9 | 8 |  |  |  |  |  | X | | | X | | | | X | | | | | | X | | | | | X | | | | X | | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-33500 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 7869184851
num_examples: 1000
download_size: 1409242614
dataset_size: 7869184851
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/wikiclir_ro | ---
pretty_name: '`wikiclir/ro`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `wikiclir/ro`
The `wikiclir/ro` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/wikiclir#wikiclir/ro).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=376,655
- `queries` (i.e., topics); count=199,264
- `qrels` (relevance assessments); count=451,180
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/wikiclir_ro', 'docs')
for record in docs:
record # {'doc_id': ..., 'title': ..., 'text': ...}
queries = load_dataset('irds/wikiclir_ro', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/wikiclir_ro', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{sasaki-etal-2018-cross,
title = "Cross-Lingual Learning-to-Rank with Shared Representations",
author = "Sasaki, Shota and
Sun, Shuo and
Schamoni, Shigehiko and
Duh, Kevin and
Inui, Kentaro",
booktitle = "Proceedings of the 2018 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)",
month = jun,
year = "2018",
address = "New Orleans, Louisiana",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/N18-2073",
doi = "10.18653/v1/N18-2073",
pages = "458--463"
}
```
|
64bits/lex_fridman_podcast_for_llm_vicuna | ---
task_categories:
- text-generation
language:
- en
pretty_name: lex-llm
tags:
- transformers
---
# Intro
This dataset represents a compilation of audio-to-text transcripts from the Lex Fridman Podcast. The Lex Fridman Podcast, hosted by Lex Fridman, an AI researcher at MIT, is a deep dive into a broad range of topics that touch on science, technology, history, philosophy, and the nature of intelligence, consciousness, love, and power. The guests on the podcast are drawn from a diverse range of fields, providing unique and insightful perspectives on these subjects.
The dataset has been formatted in ShareGPT format for use with conversational large language models (LLMs) like Vicuna, WizardVicuna, etc.
This dataset can be an invaluable resource for training and refining language models, offering a rich source of nuanced, intellectual, and thought-provoking dialogue. Furthermore, the diversity of topics covered provides a broad spectrum of language usage, idiomatic expressions, and subject matter expertise.
### 3 versions
1. _original: original dataset where each item is an entire episode
2. _chunked: chunked dataset where episodes are formatted into chunks of approximately 1200 words (roughly < 2048 tokens)
3. _chunked_gpt: changes "lex" & "guest" to "human" & "gpt" in the _chunked dataset to fit Vicuna training
# What I did
1. Fetch all episode links of the Lex Fridman Podcast
2. For each episode, transform the HTML transcript into JSON (Vicuna ShareGPT format); a sketch of this step is shown below
3. Remove the first few sentences from Lex in each episode to strip the introduction and ads
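A minimal sketch of step 2 combined with the ~1200-word chunking used for the _chunked versions. The `from`/`value` keys follow the common ShareGPT convention, and whitespace splitting is used as a rough word-count proxy; both are assumptions, not the exact script used to build this dataset.
```python
import json

def to_sharegpt_chunks(turns, max_words=1200, skip_first=3):
    """Group (speaker, text) turns into ShareGPT-style chunks of roughly max_words words.

    `turns` is a list of (speaker, text) pairs already extracted from the HTML
    transcript; the first `skip_first` turns are dropped to strip the intro/ads.
    """
    chunks, current, word_count = [], [], 0
    for speaker, text in turns[skip_first:]:
        current.append({"from": speaker, "value": text})
        word_count += len(text.split())
        if word_count >= max_words:  # close the chunk once it is "full"
            chunks.append({"conversations": current})
            current, word_count = [], 0
    if current:  # keep the trailing partial chunk
        chunks.append({"conversations": current})
    return chunks

# Tiny usage example with the _chunked_gpt speaker names
turns = [("human", "What is intelligence?"), ("gpt", "That is a deep question ...")]
print(json.dumps(to_sharegpt_chunks(turns, max_words=5, skip_first=0), indent=2))
```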
# Problems & Concerns
1. These are audio-to-text transcriptions, which may contain recognition errors
2. Although the speakers are professionals, these are spoken conversations and contain colloquial, spoken-language phrasing
3. The dataset may contain ads and personal opinions from Lex Fridman and the speakers
4. more ...
# Next Steps
1. Fine-tune LLaMA, WizardVicuna, and Vicuna models using this dataset |
waynehwang/customkopocode_korea | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7786
num_examples: 32
download_size: 4172
dataset_size: 7786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
swaroopajit/next-dataset | ---
language:
- en
size_categories:
- 10K<n<100K
dataset_info:
features:
- name: caption
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 29075736.0
num_examples: 100
download_size: 25998995
dataset_size: 29075736.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ariji1/acn_train_Test | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 74288.00649350649
num_examples: 123
- name: test
num_bytes: 18722.993506493505
num_examples: 31
download_size: 49882
dataset_size: 93011.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
bartoszmaj/nouns_full | ---
license: openrail
dataset_info:
features:
- name: nouns
sequence: string
splits:
- name: train
num_bytes: 1189290187
num_examples: 4600698
download_size: 339695326
dataset_size: 1189290187
---
|
PerceptionEval/IQTest | ---
dataset_info:
features:
- name: idx
dtype: int32
- name: question
dtype: string
- name: question_type
dtype: string
- name: image_1
dtype: image
- name: choices
sequence: string
- name: answer
dtype: string
- name: explanation
dtype: string
- name: prompt
dtype: string
splits:
- name: val
num_bytes: 5895791.0
num_examples: 150
- name: test
num_bytes: 5967887.0
num_examples: 150
download_size: 11454246
dataset_size: 11863678.0
configs:
- config_name: default
data_files:
- split: val
path: data/val-*
- split: test
path: data/test-*
---
## Source
Dataset created from (practice) graphical reasoning questions for the Civil Service Exam in China.
## Task
Identify the one picture that follows the same pattern or rule established by the previous pictures.
## Prompt:
```
Prompt 1:
During the IQ test, you'll be presented with four picture options. Your task is to identify the one picture that follows the same pattern or rule established by the previous pictures. Here are some strategies to help you determine the right choice:
1. Look for patterns related to quantity or numbers that progress from one picture to the next.
2. Check if there is a consistent way the images are rotated or flipped in sequence.
3. Identify a common feature that each successive picture shares with the previous one, while also paying attention to any variations that might indicate a regular progression or change.
Select between the following choices and tell me your answer, (A), (B), (C), or (D)?
(A) ...
(B) ...
(C) ...
(D) ...
Prompt 2:
During the IQ test, you'll be presented with four picture options. Your task involves spatial reasoning: the outer surface of the carton is displayed on the left, and you need to determine which of the following options can be folded into it.
Select between the following choices and tell me your answer, (A), (B), (C), or (D)?
(A) ...
(B) ...
(C) ...
(D) ...
```
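A minimal sketch of loading the dataset and assembling a question from one record, assuming the field names listed in the metadata above (whether the stored `prompt` already includes the rendered choices is not verified here, so treat the concatenation as illustrative):
```python
from datasets import load_dataset

# Load the validation split; fields follow the features in the card's metadata
# (question, image_1, choices, answer, explanation, prompt).
val = load_dataset("PerceptionEval/IQTest", split="val")

example = val[0]
letters = ["(A)", "(B)", "(C)", "(D)"]
# Append the stored choices to the prompt template.
options = "\n".join(f"{letter} {choice}" for letter, choice in zip(letters, example["choices"]))
print(example["prompt"] + "\n" + options)
print("Gold answer:", example["answer"])
```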
|
autoevaluate/autoeval-staging-eval-project-b7ccdeae-8bc5-40c1-85ae-3aef82a8e55e-1917 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: binary_classification
model: autoevaluate/binary-classification
metrics: ['matthews_correlation']
dataset_name: glue
dataset_config: sst2
dataset_split: validation
col_mapping:
text: sentence
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Text Classification
* Model: autoevaluate/binary-classification
* Dataset: glue
* Config: sst2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
zzzq/TaxoCompl-l2-data | ---
license: openrail
---
|
Cheetor1996/Kana_Kojima | ---
license: cc-by-2.0
language:
- en
tags:
- art
---
**Kana Kojima** from **Nande Koko ni Sensei ga!?**
- *Trained with anime (full-final-pruned) model.*
- *Works best with ALL, MIDD, OUTD, and OUTALL LoRA weight blocks, and with 0.7+ weights.* |
567-labs/wikipedia-embedding-jina-embeddings-v2-small-en-five-percent | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: embedding
sequence: float64
splits:
- name: train
num_bytes: 6656406423
num_examples: 652156
download_size: 4415285154
dataset_size: 6656406423
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_b | ---
pretty_name: Evaluation run of KnutJaegersberg/black_goo_recipe_b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/black_goo_recipe_b](https://huggingface.co/KnutJaegersberg/black_goo_recipe_b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T14:56:50.691599](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_b/blob/main/results_2023-10-17T14-56-50.691599.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n\
\ \"em_stderr\": 0.0002773614457335715,\n \"f1\": 0.05097630033557055,\n\
\ \"f1_stderr\": 0.0013271541576312406,\n \"acc\": 0.3192425320418652,\n\
\ \"acc_stderr\": 0.007133502794987516\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.0002773614457335715,\n\
\ \"f1\": 0.05097630033557055,\n \"f1_stderr\": 0.0013271541576312406\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225241\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6377269139700079,\n \"acc_stderr\": 0.013508855476252508\n\
\ }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/black_goo_recipe_b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|arc:challenge|25_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T14_56_50.691599
path:
- '**/details_harness|drop|3_2023-10-17T14-56-50.691599.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T14-56-50.691599.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T14_56_50.691599
path:
- '**/details_harness|gsm8k|5_2023-10-17T14-56-50.691599.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T14-56-50.691599.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hellaswag|10_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T14_56_50.691599
path:
- '**/details_harness|winogrande|5_2023-10-17T14-56-50.691599.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T14-56-50.691599.parquet'
- config_name: results
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- results_2023-08-31T14:15:51.764812.parquet
- split: 2023_10_17T14_56_50.691599
path:
- results_2023-10-17T14-56-50.691599.parquet
- split: latest
path:
- results_2023-10-17T14-56-50.691599.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/black_goo_recipe_b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/black_goo_recipe_b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/black_goo_recipe_b](https://huggingface.co/KnutJaegersberg/black_goo_recipe_b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-17T14:56:50.691599](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_b/blob/main/results_2023-10-17T14-56-50.691599.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335715,
"f1": 0.05097630033557055,
"f1_stderr": 0.0013271541576312406,
"acc": 0.3192425320418652,
"acc_stderr": 0.007133502794987516
},
"harness|drop|3": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335715,
"f1": 0.05097630033557055,
"f1_stderr": 0.0013271541576312406
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225241
},
"harness|winogrande|5": {
"acc": 0.6377269139700079,
"acc_stderr": 0.013508855476252508
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
flaviagiammarino/vqa-rad | ---
license: cc0-1.0
task_categories:
- visual-question-answering
language:
- en
paperswithcode_id: vqa-rad
tags:
- medical
pretty_name: VQA-RAD
size_categories:
- 1K<n<10K
dataset_info:
features:
- name: image
dtype: image
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 95883938.139
num_examples: 1793
- name: test
num_bytes: 23818877.0
num_examples: 451
download_size: 34496718
dataset_size: 119702815.139
---
# Dataset Card for VQA-RAD
## Dataset Description
VQA-RAD is a dataset of question-answer pairs on radiology images. The dataset is intended to be used for training and testing
Medical Visual Question Answering (VQA) systems. The dataset includes both open-ended questions and binary "yes/no" questions.
The dataset is built from [MedPix](https://medpix.nlm.nih.gov/), which is a free open-access online database of medical images.
The question-answer pairs were manually generated by a team of clinicians.
**Homepage:** [Open Science Framework Homepage](https://osf.io/89kps/)<br>
**Paper:** [A dataset of clinically generated visual questions and answers about radiology images](https://www.nature.com/articles/sdata2018251)<br>
**Leaderboard:** [Papers with Code Leaderboard](https://paperswithcode.com/sota/medical-visual-question-answering-on-vqa-rad)
### Dataset Summary
The dataset was downloaded from the [Open Science Framework Homepage](https://osf.io/89kps/) on June 3, 2023. The dataset contains
2,248 question-answer pairs and 315 images. Out of the 315 images, 314 images are referenced by a question-answer pair, while 1 image
is not used. The training set contains 3 duplicate image-question-answer triplets. The training set also has 1 image-question-answer
triplet in common with the test set. After dropping these 4 image-question-answer triplets from the training set, the dataset contains
2,244 question-answer pairs on 314 images.
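The splits can be loaded directly with the `datasets` library. The snippet below is a minimal sketch that loads the dataset and inspects the first training record:
```python
from datasets import load_dataset

# Load the training and test splits of VQA-RAD from the Hugging Face Hub.
dataset = load_dataset("flaviagiammarino/vqa-rad")

sample = dataset["train"][0]
print(sample["question"])    # the question about the radiology image
print(sample["answer"])      # the expected answer
print(sample["image"].size)  # the PIL image referenced by the pair
```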
#### Supported Tasks and Leaderboards
This dataset has an active leaderboard on [Papers with Code](https://paperswithcode.com/sota/medical-visual-question-answering-on-vqa-rad)
where models are ranked based on three metrics: "Close-ended Accuracy", "Open-ended accuracy" and "Overall accuracy". "Close-ended Accuracy" is
the accuracy of a model's generated answers for the subset of binary "yes/no" questions. "Open-ended accuracy" is the accuracy
of a model's generated answers for the subset of open-ended questions. "Overall accuracy" is the accuracy of a model's generated
answers across all questions.
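As an illustration, the sketch below computes these three metrics under simplifying assumptions: a question is treated as close-ended when its reference answer is "yes" or "no", and answers are scored by exact case-insensitive string match, which is only one possible evaluation protocol:
```python
def vqa_rad_accuracies(predictions, references):
    """Compute close-ended, open-ended and overall accuracy from parallel
    lists of predicted and reference answer strings (simplified sketch)."""
    closed_hits = closed_total = open_hits = open_total = 0
    for pred, ref in zip(predictions, references):
        correct = pred.strip().lower() == ref.strip().lower()
        if ref.strip().lower() in {"yes", "no"}:
            closed_total += 1
            closed_hits += correct
        else:
            open_total += 1
            open_hits += correct
    total = closed_total + open_total
    return {
        "closed_accuracy": closed_hits / closed_total if closed_total else 0.0,
        "open_accuracy": open_hits / open_total if open_total else 0.0,
        "overall_accuracy": (closed_hits + open_hits) / total if total else 0.0,
    }
```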
#### Languages
The question-answer pairs are in English.
## Dataset Structure
### Data Instances
Each instance consists of an image-question-answer triplet.
```
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=566x555>,
'question': 'are regions of the brain infarcted?',
'answer': 'yes'
}
```
### Data Fields
- `'image'`: the image referenced by the question-answer pair.
- `'question'`: the question about the image.
- `'answer'`: the expected answer.
### Data Splits
The dataset is split into training and test. The split is provided directly by the authors.
| | Training Set | Test Set |
|-------------------------|:------------:|:---------:|
| QAs |1,793 |451 |
| Images |313 |203 |
## Additional Information
### Licensing Information
The authors have released the dataset under the CC0 1.0 Universal License.
### Citation Information
```
@article{lau2018dataset,
title={A dataset of clinically generated visual questions and answers about radiology images},
author={Lau, Jason J and Gayen, Soumya and Ben Abacha, Asma and Demner-Fushman, Dina},
journal={Scientific data},
volume={5},
number={1},
pages={1--10},
year={2018},
publisher={Nature Publishing Group}
}
``` |
heliosprime/twitter_dataset_1712997177 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 12410
num_examples: 26
download_size: 10644
dataset_size: 12410
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712997177"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ayoub999/dataset_2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: bboxes
sequence:
sequence: int64
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': Ref
'2': NumFa
'3': Fourniss
'4': DateFa
'5': DateLim
'6': TotalHT
'7': TVA
'8': TotalTTc
'9': unitP
'10': Qt
'11': TVAP
'12': descp
- name: tokens
sequence: string
splits:
- name: train
num_bytes: 875976.0
num_examples: 2
- name: test
num_bytes: 1021145.0
num_examples: 1
download_size: 1276358
dataset_size: 1897121.0
---
# Dataset Card for "dataset_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_1.4b_bo16_2_64_mix_50_kl_0.1_prm_160m_thr_1.0_seed_1 | ---
dataset_info:
config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43754721
num_examples: 18928
- name: epoch_1
num_bytes: 43764507
num_examples: 18928
- name: epoch_2
num_bytes: 43702258
num_examples: 18928
- name: epoch_3
num_bytes: 43618198
num_examples: 18928
- name: epoch_4
num_bytes: 43563025
num_examples: 18928
- name: epoch_5
num_bytes: 43522727
num_examples: 18928
- name: epoch_6
num_bytes: 43519820
num_examples: 18928
- name: epoch_7
num_bytes: 43523973
num_examples: 18928
- name: epoch_8
num_bytes: 43515824
num_examples: 18928
- name: epoch_9
num_bytes: 43519081
num_examples: 18928
download_size: 325156504
dataset_size: 436004134
configs:
- config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
data_files:
- split: epoch_0
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_9-*
---
|
reza-alipour/M3TM | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: mask
dtype: image
- name: landmark
dtype: image
- name: captions_eng
sequence: string
- name: captions_all
sequence: string
splits:
- name: m3_train
num_bytes: 7994228058.75
num_examples: 173078
- name: m3_edit
num_bytes: 237830707.125
num_examples: 5141
- name: m3_train_filtered
num_bytes: 1067954822.75
num_examples: 23398
- name: m3_test_filtered
num_bytes: 38910069.0
num_examples: 856
- name: hq_train
num_bytes: 3177593184.375
num_examples: 28495
- name: hq_test
num_bytes: 193838011.25
num_examples: 1498
download_size: 11574986806
dataset_size: 12710354853.25
configs:
- config_name: default
data_files:
- split: m3_train
path: data/m3_train-*
- split: m3_edit
path: data/m3_edit-*
- split: m3_train_filtered
path: data/m3_train_filtered-*
- split: m3_test_filtered
path: data/m3_test_filtered-*
- split: hq_train
path: data/hq_train-*
- split: hq_test
path: data/hq_test-*
---
|
Nexdata/1044_Hours_Minnan_Dialect_Speech_Data_by_Mobile_Phone | ---
license: cc-by-nc-nd-4.0
---
## Description
Hokkien (China) Dialect Scripted Monologue Smartphone speech dataset, collected from monologues based on given prompts, covering short messages and 30+ customer consultation domains. Transcribed with text content, gender, age, accent and other attributes. Our dataset was collected from an extensive and geographically diverse pool of speakers (2,496 people from Quanzhou, Zhangzhou, Taiwan, Xiamen and other southern China districts), enhancing model performance in real and complex tasks. Quality tested by various AI companies. We strictly adhere to data protection regulations and privacy standards, ensuring that user privacy and legal rights are maintained throughout the data collection, storage, and usage processes; our datasets are all GDPR, CCPA and PIPL compliant.
For more details, please refer to the link: https://www.nexdata.ai/dataset/50?source=Huggingface
# Specifications
## Format
16kHz, 16bit, wav, mono channel
## Content category
Customer consultation (covering 30+ domains); short message
## Recording condition
Low background noise (indoor)
## Recording device
Smartphone; Android:iOS = 3:1
## Country
China(CHN)
## Language
Hokkien
## Speaker
2,496 people; 55% female; 1,049 speakers are between 21 and 25 years old; speakers are from Quanzhou, Zhangzhou, Taiwan, Xiamen and other southern China districts
## Features of annotation
Transcription text, gender, age, accent, noise
# Licensing Information
Commercial License
|
frascuchon/stackoverflow_feedback_demo | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for stackoverflow_feedback_demo
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("frascuchon/stackoverflow_feedback_demo")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("frascuchon/stackoverflow_feedback_demo")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, and **guidelines**.
The **fields** are the dataset records themselves; for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| title | Title | text | True | False |
| question | Question | text | True | True |
| answer | Answer | text | True | True |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| title_question_fit | Does the title match the question? | label_selection | True | N/A | ['yes', 'no'] |
| tags | What are the topics mentioned in this question? | multi_label_selection | True | N/A | ['python', 'django', 'python-2.7', 'list', 'python-3.x', 'numpy', 'pandas', 'regex', 'dictionary', 'string', 'matplotlib', 'arrays', 'google-app-engine', 'csv', 'tkinter', 'flask', 'json', 'linux', 'mysql', 'html', 'function', 'file', 'class', 'algorithm', 'windows', 'scipy', 'loops', 'multithreading', 'beautifulsoup', 'django-models', 'for-loop', 'javascript', 'xml', 'sqlalchemy', 'parsing', 'performance', 'datetime', 'osx', 'sorting', 'unicode', 'c++', 'dataframe', 'selenium', 'subprocess', 'pygame', 'java', 'pyqt', 'pip', 'tuples', 'scrapy'] |
| answer_quality | Rate the quality of the answer: | rating | True | N/A | [1, 2, 3, 4, 5] |
| new_answer | If needed, correct the answer | text | False | N/A | N/A |
**✨ NEW** Additionally, we also have **suggestions**, which are linked to the existing questions. They are named by appending "-suggestion" and "-suggestion-metadata" to the question names, and contain the value(s) of the suggestion and its metadata, respectively. The possible values are the same as in the table above.
Finally, the **guidelines** are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
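As a minimal sketch (not the exact configuration stored in `argilla.yaml`), an equivalent dataset could be declared programmatically with the Argilla client; the tag label list is truncated here for brevity:
```python
import argilla as rg

feedback_dataset = rg.FeedbackDataset(
    fields=[
        rg.TextField(name="title", title="Title"),
        rg.TextField(name="question", title="Question", use_markdown=True),
        rg.TextField(name="answer", title="Answer", use_markdown=True),
    ],
    questions=[
        rg.LabelQuestion(
            name="title_question_fit",
            title="Does the title match the question?",
            labels=["yes", "no"],
        ),
        rg.MultiLabelQuestion(
            name="tags",
            title="What are the topics mentioned in this question?",
            labels=["python", "django", "pandas"],  # truncated label list
        ),
        rg.RatingQuestion(
            name="answer_quality",
            title="Rate the quality of the answer:",
            values=[1, 2, 3, 4, 5],
        ),
        rg.TextQuestion(
            name="new_answer",
            title="If needed, correct the answer",
            required=False,
        ),
    ],
)
```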
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"answer": "\u003cp\u003eUnfortunately the only API that isn\u0027t deprecated is located in the ApplicationServices framework, which doesn\u0027t have a bridge support file, and thus isn\u0027t available in the bridge. If you\u0027re wanting to use ctypes, you can use ATSFontGetFileReference after looking up the ATSFontRef.\u003c/p\u003e\r\n\r\n\u003cp\u003eCocoa doesn\u0027t have any native support, at least as of 10.5, for getting the location of a font.\u003c/p\u003e",
"question": "\u003cp\u003eI am using the Photoshop\u0027s javascript API to find the fonts in a given PSD.\u003c/p\u003e\n\n\u003cp\u003eGiven a font name returned by the API, I want to find the actual physical font file that that font name corresponds to on the disc.\u003c/p\u003e\n\n\u003cp\u003eThis is all happening in a python program running on OSX so I guess I\u0027m looking for one of:\u003c/p\u003e\n\n\u003cul\u003e\n\u003cli\u003eSome Photoshop javascript\u003c/li\u003e\n\u003cli\u003eA Python function\u003c/li\u003e\n\u003cli\u003eAn OSX API that I can call from python\u003c/li\u003e\n\u003c/ul\u003e\n",
"title": "How can I find the full path to a font from its display name on a Mac?"
},
"metadata": {},
"responses": [
{
"status": "submitted",
"user_id": "5a053951-24cd-4c9d-9e0c-8a054b95b812",
"values": {
"answer_quality": {
"value": 1
},
"new_answer": {
"value": "Sample answer"
},
"tags": {
"value": [
"tkinter"
]
},
"title_question_fit": {
"value": "yes"
}
}
}
],
"suggestions": []
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"answer": "\u003cp\u003eUnfortunately the only API that isn\u0027t deprecated is located in the ApplicationServices framework, which doesn\u0027t have a bridge support file, and thus isn\u0027t available in the bridge. If you\u0027re wanting to use ctypes, you can use ATSFontGetFileReference after looking up the ATSFontRef.\u003c/p\u003e\r\n\r\n\u003cp\u003eCocoa doesn\u0027t have any native support, at least as of 10.5, for getting the location of a font.\u003c/p\u003e",
"answer_quality": [
{
"status": "submitted",
"user_id": "5a053951-24cd-4c9d-9e0c-8a054b95b812",
"value": 1
}
],
"answer_quality-suggestion": null,
"answer_quality-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"external_id": null,
"metadata": "{}",
"new_answer": [
{
"status": "submitted",
"user_id": "5a053951-24cd-4c9d-9e0c-8a054b95b812",
"value": "Sample answer"
}
],
"new_answer-suggestion": null,
"new_answer-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"question": "\u003cp\u003eI am using the Photoshop\u0027s javascript API to find the fonts in a given PSD.\u003c/p\u003e\n\n\u003cp\u003eGiven a font name returned by the API, I want to find the actual physical font file that that font name corresponds to on the disc.\u003c/p\u003e\n\n\u003cp\u003eThis is all happening in a python program running on OSX so I guess I\u0027m looking for one of:\u003c/p\u003e\n\n\u003cul\u003e\n\u003cli\u003eSome Photoshop javascript\u003c/li\u003e\n\u003cli\u003eA Python function\u003c/li\u003e\n\u003cli\u003eAn OSX API that I can call from python\u003c/li\u003e\n\u003c/ul\u003e\n",
"tags": [
{
"status": "submitted",
"user_id": "5a053951-24cd-4c9d-9e0c-8a054b95b812",
"value": [
"tkinter"
]
}
],
"tags-suggestion": null,
"tags-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"title": "How can I find the full path to a font from its display name on a Mac?",
"title_question_fit": [
{
"status": "submitted",
"user_id": "5a053951-24cd-4c9d-9e0c-8a054b95b812",
"value": "yes"
}
],
"title_question_fit-suggestion": null,
"title_question_fit-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
}
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **title** is of type `text`.
* **question** is of type `text`.
* **answer** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **title_question_fit** is of type `label_selection` with the following allowed values ['yes', 'no'].
* **tags** is of type `multi_label_selection` with the following allowed values ['python', 'django', 'python-2.7', 'list', 'python-3.x', 'numpy', 'pandas', 'regex', 'dictionary', 'string', 'matplotlib', 'arrays', 'google-app-engine', 'csv', 'tkinter', 'flask', 'json', 'linux', 'mysql', 'html', 'function', 'file', 'class', 'algorithm', 'windows', 'scipy', 'loops', 'multithreading', 'beautifulsoup', 'django-models', 'for-loop', 'javascript', 'xml', 'sqlalchemy', 'parsing', 'performance', 'datetime', 'osx', 'sorting', 'unicode', 'c++', 'dataframe', 'selenium', 'subprocess', 'pygame', 'java', 'pyqt', 'pip', 'tuples', 'scrapy'].
* **answer_quality** is of type `rating` with the following allowed values [1, 2, 3, 4, 5].
* (optional) **new_answer** is of type `text`.
* **✨ NEW** **Suggestions:** As of Argilla 1.13.0, suggestions have been included to ease or assist the annotators during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **title_question_fit-suggestion** is of type `label_selection` with the following allowed values ['yes', 'no'].
* (optional) **tags-suggestion** is of type `multi_label_selection` with the following allowed values ['python', 'django', 'python-2.7', 'list', 'python-3.x', 'numpy', 'pandas', 'regex', 'dictionary', 'string', 'matplotlib', 'arrays', 'google-app-engine', 'csv', 'tkinter', 'flask', 'json', 'linux', 'mysql', 'html', 'function', 'file', 'class', 'algorithm', 'windows', 'scipy', 'loops', 'multithreading', 'beautifulsoup', 'django-models', 'for-loop', 'javascript', 'xml', 'sqlalchemy', 'parsing', 'performance', 'datetime', 'osx', 'sorting', 'unicode', 'c++', 'dataframe', 'selenium', 'subprocess', 'pygame', 'java', 'pyqt', 'pip', 'tuples', 'scrapy'].
* (optional) **answer_quality-suggestion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5].
* (optional) **new_answer-suggestion** is of type `text`.
Additionally, we also have one more field which is optional and is the following:
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
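As a minimal sketch of how these fields look once loaded with `datasets` (following the flattened record shown above), the submitted response values can be pulled out as follows:
```python
from datasets import load_dataset

ds = load_dataset("frascuchon/stackoverflow_feedback_demo", split="train")

record = ds[0]
# Each question column holds a list of responses, each with a status, user_id and value.
submitted_ratings = [
    r["value"] for r in (record["answer_quality"] or []) if r["status"] == "submitted"
]
print(record["title"])
print(submitted_ratings)
```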
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yzhuang/autotree_snnxor_n0_l1_2 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 123760000
num_examples: 10000
- name: validation
num_bytes: 123760000
num_examples: 10000
- name: test
num_bytes: 123760000
num_examples: 10000
download_size: 183536639
dataset_size: 371280000
---
# Dataset Card for "autotree_snnxor_n0_l1_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huynguyendayrui/ecthr_a | ---
dataset_info:
features:
- name: text
sequence: string
- name: labels
sequence:
class_label:
names:
'0': '2'
'1': '3'
'2': '5'
'3': '6'
'4': '8'
'5': '9'
'6': '10'
'7': '11'
'8': '14'
'9': P1-1
- name: law
sequence: string
splits:
- name: train
num_bytes: 267388077
num_examples: 9000
- name: test
num_bytes: 35341614
num_examples: 1000
- name: validation
num_bytes: 33910427
num_examples: 1000
download_size: 157580405
dataset_size: 336640118
---
# Dataset Card for "ecthr_a"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Gioforce/mats | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_sst2_present_perfect_for_past | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 13568
num_examples: 88
- name: test
num_bytes: 28719
num_examples: 185
- name: train
num_bytes: 433732
num_examples: 3636
download_size: 259027
dataset_size: 476019
---
# Dataset Card for "MULTI_VALUE_sst2_present_perfect_for_past"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aninha134114/danna | ---
license: openrail
---
|
dim/HC3_ru_8k | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: human_answers
sequence: string
- name: chatgpt_answers
sequence: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 44537809.06175479
num_examples: 8000
download_size: 21121279
dataset_size: 44537809.06175479
---
# Dataset Card for "HC3_ru_8k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adhisetiawan/bdd10k-bitmasks | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 870529576.0
num_examples: 7000
- name: validation
num_bytes: 159751856.0
num_examples: 1000
download_size: 980250621
dataset_size: 1030281432.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
CyberHarem/a_91_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of a_91/A-91/A-91 (Girls' Frontline)
This is the dataset of a_91/A-91/A-91 (Girls' Frontline), containing 20 images and their tags.
The core tags of this character are `blonde_hair, long_hair, breasts, yellow_eyes, hair_between_eyes, mole, mole_under_eye, bangs, large_breasts, hat, medium_breasts, multicolored_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 29.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_91_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 15.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_91_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 46 | 29.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_91_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 24.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_91_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 46 | 42.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_91_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/a_91_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, blush, smile, looking_at_viewer, solo, gloves, open_mouth, black_bodysuit, holding, cleavage, drunk |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | smile | looking_at_viewer | solo | gloves | open_mouth | black_bodysuit | holding | cleavage | drunk |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:--------------------|:-------|:---------|:-------------|:-----------------|:----------|:-----------|:--------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
|
louisbrulenaudet/code-juridictions-financieres | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code des juridictions financières
source_datasets:
- original
pretty_name: Code des juridictions financières
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code des juridictions financières, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging
import os
import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries; each dictionary contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
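As an illustration of how these fields fit together, the following is a minimal, hypothetical sketch that pairs a random instruction from the list above with an article to produce one record; the `article` keys used here are assumptions made for the example, not the exact ones used to build the dataset:
```python
import random

def build_record(article: dict) -> dict:
    # Hypothetical helper: `article` is assumed to expose "text", "start",
    # "expiration" and "num" keys; only the output record layout follows the card.
    return {
        "instruction": random.choice(instructions),
        "input": "",  # left empty in this sketch; the card only documents the field names
        "output": article["text"],
        "start": article["start"],
        "expiration": article["expiration"],
        "num": article["num"],
    }
```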
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
open-llm-leaderboard/details_undi95__llama2-to-mistral-diff | ---
pretty_name: Evaluation run of undi95/llama2-to-mistral-diff
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [undi95/llama2-to-mistral-diff](https://huggingface.co/undi95/llama2-to-mistral-diff)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_undi95__llama2-to-mistral-diff\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T09:37:53.083823](https://huggingface.co/datasets/open-llm-leaderboard/details_undi95__llama2-to-mistral-diff/blob/main/results_2023-10-25T09-37-53.083823.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n\
\ \"em_stderr\": 0.00034761798968571027,\n \"f1\": 0.05605494966442959,\n\
\ \"f1_stderr\": 0.0013169501309663063,\n \"acc\": 0.4076941764856182,\n\
\ \"acc_stderr\": 0.009790166925519655\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001153523489932886,\n \"em_stderr\": 0.00034761798968571027,\n\
\ \"f1\": 0.05605494966442959,\n \"f1_stderr\": 0.0013169501309663063\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07505686125852919,\n \
\ \"acc_stderr\": 0.007257633145486643\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n\
\ }\n}\n```"
repo_url: https://huggingface.co/undi95/llama2-to-mistral-diff
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|arc:challenge|25_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T07_59_15.869817
path:
- '**/details_harness|drop|3_2023-10-24T07-59-15.869817.parquet'
- split: 2023_10_25T09_37_53.083823
path:
- '**/details_harness|drop|3_2023-10-25T09-37-53.083823.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T09-37-53.083823.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T07_59_15.869817
path:
- '**/details_harness|gsm8k|5_2023-10-24T07-59-15.869817.parquet'
- split: 2023_10_25T09_37_53.083823
path:
- '**/details_harness|gsm8k|5_2023-10-25T09-37-53.083823.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T09-37-53.083823.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hellaswag|10_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T07_59_15.869817
path:
- '**/details_harness|winogrande|5_2023-10-24T07-59-15.869817.parquet'
- split: 2023_10_25T09_37_53.083823
path:
- '**/details_harness|winogrande|5_2023-10-25T09-37-53.083823.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T09-37-53.083823.parquet'
- config_name: results
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- results_2023-10-10T12-55-48.397880.parquet
- split: 2023_10_24T07_59_15.869817
path:
- results_2023-10-24T07-59-15.869817.parquet
- split: 2023_10_25T09_37_53.083823
path:
- results_2023-10-25T09-37-53.083823.parquet
- split: latest
path:
- results_2023-10-25T09-37-53.083823.parquet
---
# Dataset Card for Evaluation run of undi95/llama2-to-mistral-diff
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/undi95/llama2-to-mistral-diff
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [undi95/llama2-to-mistral-diff](https://huggingface.co/undi95/llama2-to-mistral-diff) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_undi95__llama2-to-mistral-diff",
"harness_winogrande_5",
split="train")
```
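A specific historical run can also be loaded by using its timestamped split name instead of `latest`. The sketch below is a minimal example and assumes the split names listed in the YAML configuration above are still present in the repository:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_undi95__llama2-to-mistral-diff"

# Split names come from the "harness_winogrande_5" configuration above;
# each timestamp identifies one evaluation run.
older_run = load_dataset(REPO, "harness_winogrande_5",
                         split="2023_10_24T07_59_15.869817")
latest_run = load_dataset(REPO, "harness_winogrande_5", split="latest")

print(older_run)
print(latest_run)
```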
## Latest results
These are the [latest results from run 2023-10-25T09:37:53.083823](https://huggingface.co/datasets/open-llm-leaderboard/details_undi95__llama2-to-mistral-diff/blob/main/results_2023-10-25T09-37-53.083823.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.001153523489932886,
"em_stderr": 0.00034761798968571027,
"f1": 0.05605494966442959,
"f1_stderr": 0.0013169501309663063,
"acc": 0.4076941764856182,
"acc_stderr": 0.009790166925519655
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.00034761798968571027,
"f1": 0.05605494966442959,
"f1_stderr": 0.0013169501309663063
},
"harness|gsm8k|5": {
"acc": 0.07505686125852919,
"acc_stderr": 0.007257633145486643
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552667
}
}
```
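The aggregated numbers above are also stored in the "results" configuration declared in the YAML header. A minimal sketch for loading them with `datasets` follows; the exact column layout of the results parquet files is not documented here, so the code only inspects whatever columns are present:
```python
from datasets import load_dataset

# "results" is the aggregated-results configuration; "latest" points to the
# most recent run (2023-10-25T09-37-53.083823 at the time of writing).
results = load_dataset(
    "open-llm-leaderboard/details_undi95__llama2-to-mistral-diff",
    "results",
    split="latest",
)
print(results.column_names)  # inspect the schema before relying on it
print(results[0])            # first aggregated record
```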
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mask-distilled-one-sec-cv12/chunk_75 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1235283556
num_examples: 242593
download_size: 1259370393
dataset_size: 1235283556
---
# Dataset Card for "chunk_75"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PhilSch/Testging | ---
license: unknown
---
|
nishanthc/dnd_map_dataset_v0.1 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: picture_url
dtype: string
- name: picture_text
dtype: string
- name: data_source
dtype: string
- name: licence
dtype: string
- name: image
dtype: image
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 13579897320.531
num_examples: 4377
download_size: 13512723789
dataset_size: 13579897320.531
---
# Dataset Card for "dnd_map_dataset_v0.1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
confit/librispeech | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: speaker_id
dtype: string
- name: label
dtype:
class_label:
names:
'0': '100'
'1': '1001'
'2': '1006'
'3': '101'
'4': '1012'
'5': '1018'
'6': '102'
'7': '1025'
'8': '1027'
'9': '1028'
'10': '103'
'11': '1031'
'12': '1034'
'13': '104'
'14': '1040'
'15': '1046'
'16': '1049'
'17': '1050'
'18': '1051'
'19': '1052'
'20': '1053'
'21': '1054'
'22': '1058'
'23': '1060'
'24': '1061'
'25': '1065'
'26': '1066'
'27': '1069'
'28': '107'
'29': '1079'
'30': '1081'
'31': '1084'
'32': '1085'
'33': '1088'
'34': '1089'
'35': '1092'
'36': '1093'
'37': '1094'
'38': '1096'
'39': '1097'
'40': '1098'
'41': '110'
'42': '1100'
'43': '1107'
'44': '111'
'45': '1110'
'46': '1112'
'47': '1116'
'48': '112'
'49': '1121'
'50': '1124'
'51': '1132'
'52': '114'
'53': '115'
'54': '1152'
'55': '1154'
'56': '116'
'57': '1160'
'58': '1161'
'59': '1165'
'60': '1166'
'61': '1168'
'62': '1171'
'63': '1175'
'64': '1179'
'65': '118'
'66': '1182'
'67': '1183'
'68': '1184'
'69': '1187'
'70': '1188'
'71': '119'
'72': '1195'
'73': '1200'
'74': '121'
'75': '1212'
'76': '122'
'77': '1221'
'78': '1222'
'79': '1224'
'80': '1225'
'81': '1226'
'82': '123'
'83': '1230'
'84': '1235'
'85': '1239'
'86': '1241'
'87': '1246'
'88': '125'
'89': '1250'
'90': '1252'
'91': '1255'
'92': '1258'
'93': '1259'
'94': '126'
'95': '1260'
'96': '1261'
'97': '1263'
'98': '1264'
'99': '1265'
'100': '1266'
'101': '127'
'102': '1271'
'103': '1272'
'104': '1274'
'105': '128'
'106': '1280'
'107': '1283'
'108': '1284'
'109': '1289'
'110': '1290'
'111': '1291'
'112': '1296'
'113': '1298'
'114': '1311'
'115': '1313'
'116': '1316'
'117': '1320'
'118': '1322'
'119': '1323'
'120': '133'
'121': '1331'
'122': '1334'
'123': '1335'
'124': '1336'
'125': '1337'
'126': '1341'
'127': '1342'
'128': '1343'
'129': '1347'
'130': '1348'
'131': '1349'
'132': '1353'
'133': '1355'
'134': '1363'
'135': '1365'
'136': '1367'
'137': '1370'
'138': '1373'
'139': '1374'
'140': '1379'
'141': '1382'
'142': '1383'
'143': '1384'
'144': '1387'
'145': '1390'
'146': '1392'
'147': '14'
'148': '1401'
'149': '1403'
'150': '1413'
'151': '1414'
'152': '1417'
'153': '1421'
'154': '1422'
'155': '1425'
'156': '1430'
'157': '1444'
'158': '1445'
'159': '1446'
'160': '1447'
'161': '1448'
'162': '1455'
'163': '1456'
'164': '1460'
'165': '1462'
'166': '1463'
'167': '1469'
'168': '147'
'169': '1472'
'170': '1473'
'171': '1474'
'172': '1482'
'173': '1485'
'174': '1487'
'175': '149'
'176': '1492'
'177': '1494'
'178': '1495'
'179': '1498'
'180': '150'
'181': '1502'
'182': '1505'
'183': '1509'
'184': '151'
'185': '1513'
'186': '152'
'187': '153'
'188': '1535'
'189': '1536'
'190': '154'
'191': '1544'
'192': '1545'
'193': '1547'
'194': '1552'
'195': '1553'
'196': '1556'
'197': '1559'
'198': '1563'
'199': '1564'
'200': '1566'
'201': '1569'
'202': '157'
'203': '1571'
'204': '1572'
'205': '1578'
'206': '1579'
'207': '1580'
'208': '1585'
'209': '159'
'210': '1593'
'211': '1594'
'212': '1595'
'213': '16'
'214': '1601'
'215': '1603'
'216': '1607'
'217': '161'
'218': '1614'
'219': '1618'
'220': '1621'
'221': '1624'
'222': '1629'
'223': '163'
'224': '1630'
'225': '1633'
'226': '1634'
'227': '1636'
'228': '1638'
'229': '1639'
'230': '1641'
'231': '1643'
'232': '1645'
'233': '1646'
'234': '1647'
'235': '1648'
'236': '1649'
'237': '1650'
'238': '1651'
'239': '1653'
'240': '166'
'241': '1664'
'242': '1665'
'243': '1668'
'244': '167'
'245': '1673'
'246': '1674'
'247': '1678'
'248': '1679'
'249': '168'
'250': '1680'
'251': '1681'
'252': '1685'
'253': '1686'
'254': '1688'
'255': '1690'
'256': '1691'
'257': '1693'
'258': '1695'
'259': '1696'
'260': '1699'
'261': '17'
'262': '1701'
'263': '1704'
'264': '1705'
'265': '1708'
'266': '1710'
'267': '1714'
'268': '1715'
'269': '1717'
'270': '1721'
'271': '1723'
'272': '1724'
'273': '1726'
'274': '173'
'275': '1731'
'276': '1733'
'277': '1734'
'278': '1736'
'279': '1737'
'280': '174'
'281': '1740'
'282': '1743'
'283': '1746'
'284': '1748'
'285': '175'
'286': '1750'
'287': '1752'
'288': '1754'
'289': '1756'
'290': '1757'
'291': '176'
'292': '1760'
'293': '1765'
'294': '1767'
'295': '1769'
'296': '177'
'297': '1772'
'298': '1773'
'299': '1776'
'300': '1777'
'301': '1779'
'302': '1780'
'303': '1784'
'304': '1789'
'305': '1795'
'306': '1800'
'307': '1801'
'308': '1804'
'309': '1806'
'310': '1809'
'311': '1811'
'312': '1813'
'313': '1815'
'314': '1819'
'315': '1825'
'316': '1826'
'317': '1827'
'318': '1828'
'319': '1841'
'320': '1844'
'321': '1845'
'322': '1846'
'323': '1849'
'324': '1851'
'325': '1859'
'326': '1863'
'327': '1867'
'328': '1868'
'329': '1870'
'330': '1874'
'331': '1878'
'332': '188'
'333': '1885'
'334': '1898'
'335': '19'
'336': '1901'
'337': '1903'
'338': '1913'
'339': '1914'
'340': '1919'
'341': '192'
'342': '1920'
'343': '1923'
'344': '1924'
'345': '1926'
'346': '1931'
'347': '1933'
'348': '1938'
'349': '1943'
'350': '1944'
'351': '1958'
'352': '196'
'353': '1961'
'354': '1963'
'355': '1968'
'356': '1970'
'357': '1974'
'358': '1977'
'359': '198'
'360': '1985'
'361': '1987'
'362': '1988'
'363': '1989'
'364': '199'
'365': '1992'
'366': '1993'
'367': '1995'
'368': '1998'
'369': '20'
'370': '200'
'371': '2001'
'372': '2002'
'373': '2003'
'374': '2004'
'375': '2007'
'376': '201'
'377': '2010'
'378': '2012'
'379': '2013'
'380': '202'
'381': '2021'
'382': '2026'
'383': '203'
'384': '2033'
'385': '2035'
'386': '2039'
'387': '204'
'388': '2042'
'389': '2045'
'390': '2046'
'391': '205'
'392': '2050'
'393': '2051'
'394': '2053'
'395': '2056'
'396': '2060'
'397': '2061'
'398': '2062'
'399': '2063'
'400': '2067'
'401': '2068'
'402': '207'
'403': '2074'
'404': '2078'
'405': '208'
'406': '2085'
'407': '2086'
'408': '2089'
'409': '209'
'410': '2090'
'411': '2092'
'412': '2093'
'413': '2094'
'414': '2096'
'415': '210'
'416': '2100'
'417': '2104'
'418': '211'
'419': '2110'
'420': '2113'
'421': '2122'
'422': '2127'
'423': '2133'
'424': '2136'
'425': '2137'
'426': '2140'
'427': '2143'
'428': '2146'
'429': '2148'
'430': '2149'
'431': '215'
'432': '2152'
'433': '2156'
'434': '2159'
'435': '216'
'436': '2162'
'437': '2167'
'438': '217'
'439': '218'
'440': '2182'
'441': '2185'
'442': '2194'
'443': '2195'
'444': '2196'
'445': '2198'
'446': '22'
'447': '2201'
'448': '2204'
'449': '2208'
'450': '2229'
'451': '2230'
'452': '2234'
'453': '2237'
'454': '2238'
'455': '224'
'456': '2240'
'457': '2246'
'458': '225'
'459': '2254'
'460': '2256'
'461': '226'
'462': '2262'
'463': '2269'
'464': '227'
'465': '2270'
'466': '2272'
'467': '2273'
'468': '2275'
'469': '2276'
'470': '2277'
'471': '2279'
'472': '228'
'473': '2284'
'474': '2285'
'475': '2288'
'476': '2289'
'477': '229'
'478': '2292'
'479': '2294'
'480': '2297'
'481': '2299'
'482': '23'
'483': '2300'
'484': '2301'
'485': '2309'
'486': '231'
'487': '2312'
'488': '2319'
'489': '233'
'490': '2334'
'491': '2339'
'492': '2341'
'493': '2346'
'494': '2348'
'495': '2351'
'496': '2356'
'497': '2361'
'498': '2364'
'499': '2368'
'500': '237'
'501': '2374'
'502': '238'
'503': '2380'
'504': '2384'
'505': '2388'
'506': '2391'
'507': '2393'
'508': '2397'
'509': '240'
'510': '2401'
'511': '2404'
'512': '2405'
'513': '2407'
'514': '2411'
'515': '2412'
'516': '2414'
'517': '2416'
'518': '242'
'519': '2427'
'520': '2428'
'521': '243'
'522': '2436'
'523': '2437'
'524': '2445'
'525': '2448'
'526': '245'
'527': '246'
'528': '2473'
'529': '248'
'530': '2481'
'531': '2485'
'532': '2487'
'533': '2488'
'534': '249'
'535': '2491'
'536': '2494'
'537': '2496'
'538': '2498'
'539': '2499'
'540': '25'
'541': '250'
'542': '2504'
'543': '2506'
'544': '251'
'545': '2512'
'546': '2514'
'547': '2517'
'548': '2518'
'549': '252'
'550': '2522'
'551': '2526'
'552': '253'
'553': '2531'
'554': '2532'
'555': '2533'
'556': '254'
'557': '2541'
'558': '2544'
'559': '2545'
'560': '255'
'561': '2552'
'562': '2553'
'563': '2562'
'564': '2568'
'565': '2570'
'566': '2573'
'567': '2574'
'568': '2577'
'569': '258'
'570': '2581'
'571': '2582'
'572': '2587'
'573': '2588'
'574': '2589'
'575': '2592'
'576': '2598'
'577': '26'
'578': '260'
'579': '2606'
'580': '2607'
'581': '2609'
'582': '2618'
'583': '2624'
'584': '2625'
'585': '2628'
'586': '263'
'587': '2638'
'588': '264'
'589': '265'
'590': '2652'
'591': '2654'
'592': '2660'
'593': '2671'
'594': '2673'
'595': '2674'
'596': '2676'
'597': '2688'
'598': '2691'
'599': '2694'
'600': '2696'
'601': '27'
'602': '2709'
'603': '2712'
'604': '272'
'605': '2724'
'606': '273'
'607': '2730'
'608': '2733'
'609': '2735'
'610': '274'
'611': '2740'
'612': '2741'
'613': '2748'
'614': '2751'
'615': '2754'
'616': '2758'
'617': '2762'
'618': '2764'
'619': '2769'
'620': '277'
'621': '2774'
'622': '2775'
'623': '278'
'624': '2785'
'625': '2787'
'626': '2790'
'627': '2792'
'628': '28'
'629': '2803'
'630': '2812'
'631': '2815'
'632': '2816'
'633': '2817'
'634': '2823'
'635': '2825'
'636': '2827'
'637': '283'
'638': '2830'
'639': '2834'
'640': '2836'
'641': '2843'
'642': '2853'
'643': '2854'
'644': '288'
'645': '2882'
'646': '289'
'647': '2893'
'648': '2895'
'649': '29'
'650': '2902'
'651': '2909'
'652': '2910'
'653': '2911'
'654': '2919'
'655': '2920'
'656': '2925'
'657': '2929'
'658': '2930'
'659': '294'
'660': '2943'
'661': '2946'
'662': '2952'
'663': '296'
'664': '2960'
'665': '2961'
'666': '2967'
'667': '2971'
'668': '2975'
'669': '2979'
'670': '298'
'671': '2985'
'672': '2988'
'673': '2989'
'674': '2990'
'675': '2992'
'676': '2997'
'677': '2998'
'678': '2999'
'679': '30'
'680': '3000'
'681': '3001'
'682': '3003'
'683': '3005'
'684': '3006'
'685': '3008'
'686': '3009'
'687': '302'
'688': '3020'
'689': '3021'
'690': '3025'
'691': '303'
'692': '3032'
'693': '3033'
'694': '3045'
'695': '3046'
'696': '3053'
'697': '3054'
'698': '3060'
'699': '3063'
'700': '307'
'701': '3070'
'702': '3072'
'703': '3079'
'704': '3080'
'705': '3081'
'706': '3082'
'707': '3083'
'708': '3088'
'709': '3090'
'710': '3092'
'711': '3094'
'712': '3097'
'713': '3098'
'714': '31'
'715': '310'
'716': '3100'
'717': '3105'
'718': '3109'
'719': '311'
'720': '3112'
'721': '3114'
'722': '3118'
'723': '3119'
'724': '3125'
'725': '313'
'726': '3132'
'727': '3135'
'728': '3137'
'729': '3138'
'730': '3142'
'731': '3143'
'732': '3144'
'733': '3148'
'734': '3157'
'735': '3168'
'736': '317'
'737': '3170'
'738': '3171'
'739': '3172'
'740': '3179'
'741': '318'
'742': '3180'
'743': '3185'
'744': '3187'
'745': '319'
'746': '3192'
'747': '3196'
'748': '32'
'749': '3214'
'750': '3215'
'751': '322'
'752': '3221'
'753': '3224'
'754': '3227'
'755': '3228'
'756': '323'
'757': '3230'
'758': '3235'
'759': '3238'
'760': '3240'
'761': '3242'
'762': '3244'
'763': '3245'
'764': '3257'
'765': '3258'
'766': '3259'
'767': '3261'
'768': '3268'
'769': '3271'
'770': '3272'
'771': '3274'
'772': '328'
'773': '3285'
'774': '3288'
'775': '3289'
'776': '329'
'777': '3290'
'778': '3294'
'779': '3307'
'780': '331'
'781': '3314'
'782': '3318'
'783': '3319'
'784': '332'
'785': '3328'
'786': '3330'
'787': '3331'
'788': '3334'
'789': '3340'
'790': '3346'
'791': '3347'
'792': '335'
'793': '3356'
'794': '3357'
'795': '336'
'796': '3361'
'797': '3368'
'798': '337'
'799': '3370'
'800': '3373'
'801': '3374'
'802': '3379'
'803': '3380'
'804': '3381'
'805': '3389'
'806': '339'
'807': '3394'
'808': '340'
'809': '3400'
'810': '3409'
'811': '3411'
'812': '3417'
'813': '3433'
'814': '3436'
'815': '3440'
'816': '3446'
'817': '3448'
'818': '345'
'819': '3465'
'820': '3467'
'821': '3470'
'822': '3479'
'823': '348'
'824': '3482'
'825': '3483'
'826': '3486'
'827': '3488'
'828': '3490'
'829': '3493'
'830': '3500'
'831': '3503'
'832': '3513'
'833': '3521'
'834': '3526'
'835': '3528'
'836': '353'
'837': '3536'
'838': '3537'
'839': '3538'
'840': '3540'
'841': '3541'
'842': '3546'
'843': '3547'
'844': '3549'
'845': '3551'
'846': '3553'
'847': '3554'
'848': '3557'
'849': '3559'
'850': '3564'
'851': '3567'
'852': '3570'
'853': '3571'
'854': '3575'
'855': '3576'
'856': '3584'
'857': '3587'
'858': '3588'
'859': '359'
'860': '3592'
'861': '3595'
'862': '3598'
'863': '36'
'864': '3606'
'865': '3607'
'866': '3615'
'867': '3618'
'868': '362'
'869': '3630'
'870': '3638'
'871': '3641'
'872': '3645'
'873': '3647'
'874': '365'
'875': '3650'
'876': '3654'
'877': '3656'
'878': '3657'
'879': '366'
'880': '3660'
'881': '3663'
'882': '3664'
'883': '3665'
'884': '367'
'885': '3675'
'886': '3679'
'887': '3681'
'888': '3686'
'889': '369'
'890': '3691'
'891': '3698'
'892': '3699'
'893': '37'
'894': '3703'
'895': '3717'
'896': '3723'
'897': '3728'
'898': '3729'
'899': '373'
'900': '3733'
'901': '3738'
'902': '374'
'903': '3744'
'904': '3747'
'905': '3752'
'906': '3757'
'907': '3764'
'908': '377'
'909': '3779'
'910': '3780'
'911': '3781'
'912': '3783'
'913': '3790'
'914': '3792'
'915': '3793'
'916': '3796'
'917': '3798'
'918': '38'
'919': '380'
'920': '3807'
'921': '3816'
'922': '3819'
'923': '3825'
'924': '3830'
'925': '3835'
'926': '3843'
'927': '3845'
'928': '3848'
'929': '3851'
'930': '3852'
'931': '3853'
'932': '3857'
'933': '3864'
'934': '3866'
'935': '3867'
'936': '3869'
'937': '3871'
'938': '3876'
'939': '3879'
'940': '3885'
'941': '3889'
'942': '3894'
'943': '3895'
'944': '3896'
'945': '39'
'946': '3905'
'947': '3906'
'948': '3909'
'949': '3911'
'950': '3912'
'951': '3914'
'952': '3915'
'953': '392'
'954': '3922'
'955': '3923'
'956': '3925'
'957': '3926'
'958': '3927'
'959': '3928'
'960': '3934'
'961': '3945'
'962': '3947'
'963': '3955'
'964': '3959'
'965': '3962'
'966': '3967'
'967': '3969'
'968': '3972'
'969': '3977'
'970': '3979'
'971': '398'
'972': '3982'
'973': '3983'
'974': '3989'
'975': '3990'
'976': '3992'
'977': '3994'
'978': '3997'
'979': '40'
'980': '4005'
'981': '4009'
'982': '4010'
'983': '4013'
'984': '4014'
'985': '4015'
'986': '4017'
'987': '4018'
'988': '4019'
'989': '402'
'990': '4020'
'991': '4021'
'992': '403'
'993': '4034'
'994': '4039'
'995': '404'
'996': '4042'
'997': '4044'
'998': '405'
'999': '4051'
'1000': '4054'
'1001': '4057'
'1002': '4059'
'1003': '4063'
'1004': '4064'
'1005': '4071'
'1006': '4077'
'1007': '4078'
'1008': '408'
'1009': '4085'
'1010': '4088'
'1011': '409'
'1012': '4090'
'1013': '4098'
'1014': '4104'
'1015': '4108'
'1016': '4110'
'1017': '4111'
'1018': '4116'
'1019': '412'
'1020': '4122'
'1021': '413'
'1022': '4133'
'1023': '4137'
'1024': '4138'
'1025': '4145'
'1026': '4148'
'1027': '4152'
'1028': '4153'
'1029': '4156'
'1030': '4160'
'1031': '4161'
'1032': '4172'
'1033': '4174'
'1034': '4179'
'1035': '4189'
'1036': '4191'
'1037': '4192'
'1038': '4193'
'1039': '4195'
'1040': '4196'
'1041': '4198'
'1042': '4205'
'1043': '421'
'1044': '4211'
'1045': '4214'
'1046': '4216'
'1047': '4217'
'1048': '4218'
'1049': '422'
'1050': '4222'
'1051': '4225'
'1052': '4226'
'1053': '4234'
'1054': '4235'
'1055': '4236'
'1056': '4238'
'1057': '4243'
'1058': '4246'
'1059': '4257'
'1060': '426'
'1061': '4260'
'1062': '4262'
'1063': '4263'
'1064': '4267'
'1065': '4273'
'1066': '4277'
'1067': '4278'
'1068': '428'
'1069': '4280'
'1070': '4289'
'1071': '4290'
'1072': '4294'
'1073': '4295'
'1074': '4297'
'1075': '4305'
'1076': '4310'
'1077': '4313'
'1078': '432'
'1079': '4321'
'1080': '4323'
'1081': '4327'
'1082': '4331'
'1083': '4335'
'1084': '434'
'1085': '4340'
'1086': '4344'
'1087': '4345'
'1088': '4350'
'1089': '4352'
'1090': '4356'
'1091': '4358'
'1092': '4362'
'1093': '4363'
'1094': '4379'
'1095': '4381'
'1096': '439'
'1097': '4396'
'1098': '4397'
'1099': '44'
'1100': '4402'
'1101': '4406'
'1102': '4407'
'1103': '441'
'1104': '4411'
'1105': '4415'
'1106': '4420'
'1107': '4422'
'1108': '4423'
'1109': '4425'
'1110': '4427'
'1111': '4428'
'1112': '4433'
'1113': '4434'
'1114': '4438'
'1115': '444'
'1116': '4441'
'1117': '4442'
'1118': '4443'
'1119': '4446'
'1120': '4447'
'1121': '445'
'1122': '4455'
'1123': '446'
'1124': '4463'
'1125': '4474'
'1126': '448'
'1127': '4480'
'1128': '4481'
'1129': '4484'
'1130': '4487'
'1131': '4490'
'1132': '4492'
'1133': '4495'
'1134': '45'
'1135': '4507'
'1136': '451'
'1137': '4511'
'1138': '4513'
'1139': '4515'
'1140': '4519'
'1141': '4520'
'1142': '453'
'1143': '4535'
'1144': '454'
'1145': '4545'
'1146': '4546'
'1147': '4549'
'1148': '4563'
'1149': '4570'
'1150': '4572'
'1151': '4576'
'1152': '458'
'1153': '4583'
'1154': '4586'
'1155': '459'
'1156': '4590'
'1157': '4591'
'1158': '4592'
'1159': '4594'
'1160': '4595'
'1161': '4598'
'1162': '4599'
'1163': '46'
'1164': '460'
'1165': '4629'
'1166': '464'
'1167': '4640'
'1168': '4652'
'1169': '4659'
'1170': '466'
'1171': '4660'
'1172': '4667'
'1173': '4680'
'1174': '4681'
'1175': '4687'
'1176': '4693'
'1177': '4697'
'1178': '4699'
'1179': '47'
'1180': '470'
'1181': '4701'
'1182': '4703'
'1183': '4705'
'1184': '4706'
'1185': '4710'
'1186': '4712'
'1187': '4719'
'1188': '472'
'1189': '4731'
'1190': '4733'
'1191': '4734'
'1192': '4738'
'1193': '474'
'1194': '4741'
'1195': '4742'
'1196': '4744'
'1197': '4748'
'1198': '475'
'1199': '4750'
'1200': '4757'
'1201': '476'
'1202': '4766'
'1203': '4767'
'1204': '4770'
'1205': '4771'
'1206': '4773'
'1207': '4779'
'1208': '4788'
'1209': '479'
'1210': '4791'
'1211': '4799'
'1212': '480'
'1213': '4800'
'1214': '4806'
'1215': '4807'
'1216': '481'
'1217': '4813'
'1218': '4821'
'1219': '4824'
'1220': '483'
'1221': '4830'
'1222': '4831'
'1223': '4836'
'1224': '4837'
'1225': '4839'
'1226': '4841'
'1227': '4846'
'1228': '4848'
'1229': '4852'
'1230': '4853'
'1231': '4854'
'1232': '4856'
'1233': '4859'
'1234': '4860'
'1235': '4863'
'1236': '487'
'1237': '4872'
'1238': '489'
'1239': '4894'
'1240': '4898'
'1241': '4899'
'1242': '49'
'1243': '4910'
'1244': '4915'
'1245': '492'
'1246': '4926'
'1247': '4930'
'1248': '4931'
'1249': '4936'
'1250': '4945'
'1251': '4948'
'1252': '4955'
'1253': '4957'
'1254': '4958'
'1255': '4959'
'1256': '4964'
'1257': '4965'
'1258': '4967'
'1259': '4969'
'1260': '497'
'1261': '4970'
'1262': '4973'
'1263': '4979'
'1264': '4991'
'1265': '4992'
'1266': '4993'
'1267': '500'
'1268': '5000'
'1269': '5002'
'1270': '5005'
'1271': '5007'
'1272': '5009'
'1273': '501'
'1274': '5012'
'1275': '5013'
'1276': '5019'
'1277': '5022'
'1278': '5023'
'1279': '5029'
'1280': '5036'
'1281': '5038'
'1282': '5039'
'1283': '5043'
'1284': '5044'
'1285': '5045'
'1286': '5049'
'1287': '505'
'1288': '5054'
'1289': '5060'
'1290': '5062'
'1291': '5063'
'1292': '5076'
'1293': '5077'
'1294': '5082'
'1295': '5092'
'1296': '5093'
'1297': '51'
'1298': '510'
'1299': '5101'
'1300': '5104'
'1301': '5105'
'1302': '511'
'1303': '5115'
'1304': '5118'
'1305': '512'
'1306': '5123'
'1307': '5126'
'1308': '5132'
'1309': '5133'
'1310': '5136'
'1311': '5139'
'1312': '5141'
'1313': '5142'
'1314': '5147'
'1315': '5152'
'1316': '5154'
'1317': '5157'
'1318': '5163'
'1319': '5164'
'1320': '517'
'1321': '5172'
'1322': '5181'
'1323': '5183'
'1324': '5185'
'1325': '5186'
'1326': '5189'
'1327': '5190'
'1328': '5192'
'1329': '5198'
'1330': '5199'
'1331': '52'
'1332': '5206'
'1333': '5217'
'1334': '5220'
'1335': '5224'
'1336': '5230'
'1337': '5233'
'1338': '5239'
'1339': '5242'
'1340': '5244'
'1341': '5245'
'1342': '5246'
'1343': '5248'
'1344': '525'
'1345': '5252'
'1346': '5261'
'1347': '5266'
'1348': '5269'
'1349': '5271'
'1350': '5278'
'1351': '5280'
'1352': '5285'
'1353': '5287'
'1354': '5290'
'1355': '5293'
'1356': '5296'
'1357': '5299'
'1358': '5304'
'1359': '5319'
'1360': '5321'
'1361': '5322'
'1362': '5325'
'1363': '5328'
'1364': '533'
'1365': '5333'
'1366': '5337'
'1367': '5338'
'1368': '5339'
'1369': '534'
'1370': '5340'
'1371': '5350'
'1372': '5355'
'1373': '5361'
'1374': '5375'
'1375': '5379'
'1376': '5386'
'1377': '5389'
'1378': '5390'
'1379': '5393'
'1380': '54'
'1381': '5400'
'1382': '5401'
'1383': '5405'
'1384': '5412'
'1385': '542'
'1386': '5424'
'1387': '5429'
'1388': '543'
'1389': '5439'
'1390': '544'
'1391': '5442'
'1392': '5445'
'1393': '5448'
'1394': '5456'
'1395': '5459'
'1396': '5460'
'1397': '5463'
'1398': '5468'
'1399': '5471'
'1400': '548'
'1401': '5480'
'1402': '5484'
'1403': '5487'
'1404': '5489'
'1405': '549'
'1406': '55'
'1407': '5506'
'1408': '551'
'1409': '5513'
'1410': '5514'
'1411': '5519'
'1412': '5536'
'1413': '5538'
'1414': '5543'
'1415': '5545'
'1416': '5561'
'1417': '5565'
'1418': '5567'
'1419': '5569'
'1420': '557'
'1421': '5570'
'1422': '5583'
'1423': '5588'
'1424': '559'
'1425': '56'
'1426': '5604'
'1427': '5606'
'1428': '561'
'1429': '5618'
'1430': '5620'
'1431': '5622'
'1432': '5628'
'1433': '5635'
'1434': '5636'
'1435': '5637'
'1436': '5639'
'1437': '5641'
'1438': '5649'
'1439': '5652'
'1440': '5653'
'1441': '5655'
'1442': '5656'
'1443': '5660'
'1444': '5661'
'1445': '5665'
'1446': '567'
'1447': '5671'
'1448': '5672'
'1449': '5678'
'1450': '568'
'1451': '5682'
'1452': '5683'
'1453': '5684'
'1454': '5688'
'1455': '569'
'1456': '5694'
'1457': '57'
'1458': '5700'
'1459': '5703'
'1460': '5712'
'1461': '5717'
'1462': '5719'
'1463': '572'
'1464': '5720'
'1465': '5723'
'1466': '5724'
'1467': '5725'
'1468': '5727'
'1469': '5731'
'1470': '5733'
'1471': '5735'
'1472': '5740'
'1473': '5746'
'1474': '5750'
'1475': '5756'
'1476': '576'
'1477': '5764'
'1478': '5765'
'1479': '5767'
'1480': '5772'
'1481': '5776'
'1482': '5778'
'1483': '5781'
'1484': '5784'
'1485': '5789'
'1486': '5791'
'1487': '5796'
'1488': '58'
'1489': '580'
'1490': '5802'
'1491': '5808'
'1492': '5809'
'1493': '581'
'1494': '5810'
'1495': '5825'
'1496': '5826'
'1497': '583'
'1498': '5831'
'1499': '5837'
'1500': '584'
'1501': '5840'
'1502': '5849'
'1503': '585'
'1504': '5854'
'1505': '5860'
'1506': '5867'
'1507': '5868'
'1508': '587'
'1509': '5874'
'1510': '5876'
'1511': '5883'
'1512': '5886'
'1513': '589'
'1514': '5890'
'1515': '5893'
'1516': '5894'
'1517': '5895'
'1518': '5906'
'1519': '5909'
'1520': '5910'
'1521': '5911'
'1522': '5913'
'1523': '5914'
'1524': '5918'
'1525': '5929'
'1526': '593'
'1527': '5933'
'1528': '5935'
'1529': '594'
'1530': '5940'
'1531': '5949'
'1532': '5951'
'1533': '5952'
'1534': '596'
'1535': '5968'
'1536': '597'
'1537': '5970'
'1538': '5975'
'1539': '5977'
'1540': '5979'
'1541': '598'
'1542': '5980'
'1543': '5983'
'1544': '5984'
'1545': '5985'
'1546': '5993'
'1547': '60'
'1548': '6000'
'1549': '6003'
'1550': '6006'
'1551': '6009'
'1552': '6010'
'1553': '6014'
'1554': '6019'
'1555': '6025'
'1556': '6030'
'1557': '6032'
'1558': '6035'
'1559': '6037'
'1560': '6038'
'1561': '6051'
'1562': '6054'
'1563': '606'
'1564': '6060'
'1565': '6064'
'1566': '6065'
'1567': '6070'
'1568': '6072'
'1569': '6075'
'1570': '6076'
'1571': '6077'
'1572': '6078'
'1573': '608'
'1574': '6080'
'1575': '6081'
'1576': '6082'
'1577': '6084'
'1578': '6087'
'1579': '6088'
'1580': '6097'
'1581': '6098'
'1582': '6099'
'1583': '61'
'1584': '6102'
'1585': '6104'
'1586': '6106'
'1587': '6111'
'1588': '6115'
'1589': '6119'
'1590': '612'
'1591': '6120'
'1592': '6121'
'1593': '6123'
'1594': '6126'
'1595': '6127'
'1596': '6128'
'1597': '613'
'1598': '6131'
'1599': '6135'
'1600': '6138'
'1601': '6139'
'1602': '614'
'1603': '6145'
'1604': '6147'
'1605': '6153'
'1606': '6157'
'1607': '6159'
'1608': '6160'
'1609': '6167'
'1610': '6173'
'1611': '6177'
'1612': '6178'
'1613': '6181'
'1614': '6184'
'1615': '6188'
'1616': '6189'
'1617': '6196'
'1618': '6199'
'1619': '62'
'1620': '6206'
'1621': '6209'
'1622': '6211'
'1623': '6215'
'1624': '622'
'1625': '6221'
'1626': '6224'
'1627': '6227'
'1628': '6232'
'1629': '6233'
'1630': '6235'
'1631': '6236'
'1632': '6241'
'1633': '6242'
'1634': '6248'
'1635': '6249'
'1636': '625'
'1637': '6251'
'1638': '6254'
'1639': '6258'
'1640': '6267'
'1641': '6269'
'1642': '6272'
'1643': '6276'
'1644': '6281'
'1645': '6284'
'1646': '6286'
'1647': '6288'
'1648': '6294'
'1649': '6295'
'1650': '6300'
'1651': '6308'
'1652': '6311'
'1653': '6313'
'1654': '6317'
'1655': '6319'
'1656': '6323'
'1657': '6324'
'1658': '6330'
'1659': '6332'
'1660': '6333'
'1661': '6339'
'1662': '6341'
'1663': '6345'
'1664': '6346'
'1665': '6351'
'1666': '6352'
'1667': '6353'
'1668': '6356'
'1669': '6358'
'1670': '6359'
'1671': '636'
'1672': '6364'
'1673': '6367'
'1674': '6368'
'1675': '637'
'1676': '6370'
'1677': '6371'
'1678': '6373'
'1679': '6377'
'1680': '6378'
'1681': '6385'
'1682': '6388'
'1683': '639'
'1684': '6391'
'1685': '6395'
'1686': '6399'
'1687': '64'
'1688': '6402'
'1689': '6406'
'1690': '6407'
'1691': '6411'
'1692': '6415'
'1693': '6418'
'1694': '6426'
'1695': '6432'
'1696': '6436'
'1697': '6437'
'1698': '644'
'1699': '6446'
'1700': '6454'
'1701': '6455'
'1702': '6458'
'1703': '6459'
'1704': '6467'
'1705': '6476'
'1706': '6482'
'1707': '6484'
'1708': '6488'
'1709': '6492'
'1710': '6494'
'1711': '6497'
'1712': '6499'
'1713': '65'
'1714': '6505'
'1715': '6506'
'1716': '6509'
'1717': '6510'
'1718': '6512'
'1719': '6513'
'1720': '6518'
'1721': '6519'
'1722': '652'
'1723': '6529'
'1724': '6531'
'1725': '6533'
'1726': '6534'
'1727': '6535'
'1728': '6538'
'1729': '6539'
'1730': '6540'
'1731': '6544'
'1732': '6548'
'1733': '6549'
'1734': '6550'
'1735': '6553'
'1736': '6555'
'1737': '6557'
'1738': '6563'
'1739': '6567'
'1740': '6568'
'1741': '6574'
'1742': '6575'
'1743': '6583'
'1744': '6590'
'1745': '6594'
'1746': '6599'
'1747': '66'
'1748': '6609'
'1749': '6610'
'1750': '6614'
'1751': '6620'
'1752': '6625'
'1753': '6627'
'1754': '663'
'1755': '6636'
'1756': '6637'
'1757': '664'
'1758': '6641'
'1759': '6643'
'1760': '6652'
'1761': '666'
'1762': '6660'
'1763': '6668'
'1764': '667'
'1765': '6670'
'1766': '6673'
'1767': '6674'
'1768': '6676'
'1769': '6683'
'1770': '6685'
'1771': '6686'
'1772': '6687'
'1773': '6689'
'1774': '669'
'1775': '6690'
'1776': '6694'
'1777': '6695'
'1778': '6696'
'1779': '6701'
'1780': '6705'
'1781': '6707'
'1782': '6709'
'1783': '671'
'1784': '6713'
'1785': '672'
'1786': '6724'
'1787': '6726'
'1788': '6727'
'1789': '6733'
'1790': '6735'
'1791': '6741'
'1792': '6743'
'1793': '6746'
'1794': '6747'
'1795': '6749'
'1796': '6752'
'1797': '6753'
'1798': '6754'
'1799': '6758'
'1800': '6763'
'1801': '6773'
'1802': '6777'
'1803': '6782'
'1804': '6784'
'1805': '6788'
'1806': '679'
'1807': '6792'
'1808': '6794'
'1809': '6798'
'1810': '6804'
'1811': '6807'
'1812': '681'
'1813': '6818'
'1814': '6821'
'1815': '6828'
'1816': '6829'
'1817': '6836'
'1818': '684'
'1819': '6841'
'1820': '6846'
'1821': '6848'
'1822': '6849'
'1823': '6853'
'1824': '6865'
'1825': '6875'
'1826': '6877'
'1827': '688'
'1828': '6880'
'1829': '6882'
'1830': '6883'
'1831': '6892'
'1832': '6895'
'1833': '690'
'1834': '6902'
'1835': '6904'
'1836': '6906'
'1837': '6912'
'1838': '6913'
'1839': '6914'
'1840': '6918'
'1841': '6923'
'1842': '6924'
'1843': '6925'
'1844': '6927'
'1845': '6930'
'1846': '6937'
'1847': '6938'
'1848': '6943'
'1849': '6945'
'1850': '6947'
'1851': '6950'
'1852': '6951'
'1853': '6954'
'1854': '6956'
'1855': '696'
'1856': '6962'
'1857': '6963'
'1858': '6965'
'1859': '6967'
'1860': '6974'
'1861': '6978'
'1862': '698'
'1863': '6981'
'1864': '699'
'1865': '6993'
'1866': '70'
'1867': '700'
'1868': '7000'
'1869': '7001'
'1870': '7008'
'1871': '7009'
'1872': '7011'
'1873': '7012'
'1874': '7018'
'1875': '7021'
'1876': '7026'
'1877': '7030'
'1878': '7046'
'1879': '705'
'1880': '7051'
'1881': '7055'
'1882': '7059'
'1883': '7061'
'1884': '7062'
'1885': '7065'
'1886': '7067'
'1887': '7069'
'1888': '707'
'1889': '7073'
'1890': '7078'
'1891': '7079'
'1892': '708'
'1893': '7085'
'1894': '7090'
'1895': '7092'
'1896': '7095'
'1897': '7096'
'1898': '7097'
'1899': '7105'
'1900': '7107'
'1901': '711'
'1902': '7113'
'1903': '7117'
'1904': '712'
'1905': '7120'
'1906': '7121'
'1907': '7125'
'1908': '7126'
'1909': '7127'
'1910': '7128'
'1911': '713'
'1912': '7131'
'1913': '7134'
'1914': '7135'
'1915': '7138'
'1916': '7139'
'1917': '7140'
'1918': '7143'
'1919': '7145'
'1920': '7147'
'1921': '7148'
'1922': '7150'
'1923': '7155'
'1924': '716'
'1925': '7169'
'1926': '7170'
'1927': '7176'
'1928': '7177'
'1929': '7178'
'1930': '718'
'1931': '7188'
'1932': '7189'
'1933': '7190'
'1934': '7197'
'1935': '7198'
'1936': '7199'
'1937': '720'
'1938': '7205'
'1939': '7208'
'1940': '7215'
'1941': '7218'
'1942': '7220'
'1943': '7223'
'1944': '7226'
'1945': '7228'
'1946': '7229'
'1947': '7238'
'1948': '7239'
'1949': '724'
'1950': '7240'
'1951': '7241'
'1952': '7242'
'1953': '7245'
'1954': '7246'
'1955': '7247'
'1956': '7250'
'1957': '7255'
'1958': '7258'
'1959': '726'
'1960': '7263'
'1961': '7264'
'1962': '7265'
'1963': '727'
'1964': '7276'
'1965': '7277'
'1966': '7278'
'1967': '728'
'1968': '7285'
'1969': '7286'
'1970': '7294'
'1971': '7297'
'1972': '7299'
'1973': '730'
'1974': '7301'
'1975': '7302'
'1976': '7307'
'1977': '731'
'1978': '7312'
'1979': '7313'
'1980': '7314'
'1981': '7315'
'1982': '7316'
'1983': '7318'
'1984': '7320'
'1985': '7326'
'1986': '7327'
'1987': '7331'
'1988': '7333'
'1989': '7335'
'1990': '7337'
'1991': '7338'
'1992': '7339'
'1993': '7342'
'1994': '7346'
'1995': '7348'
'1996': '7354'
'1997': '7357'
'1998': '7360'
'1999': '7367'
'2000': '737'
'2001': '7376'
'2002': '7383'
'2003': '7384'
'2004': '7387'
'2005': '7389'
'2006': '7391'
'2007': '7392'
'2008': '7395'
'2009': '7398'
'2010': '7402'
'2011': '7408'
'2012': '7416'
'2013': '742'
'2014': '7423'
'2015': '7424'
'2016': '7433'
'2017': '7434'
'2018': '7436'
'2019': '7437'
'2020': '7445'
'2021': '7447'
'2022': '7448'
'2023': '7460'
'2024': '7463'
'2025': '7467'
'2026': '7475'
'2027': '7478'
'2028': '7480'
'2029': '7481'
'2030': '7484'
'2031': '7491'
'2032': '7492'
'2033': '7495'
'2034': '7498'
'2035': '75'
'2036': '7502'
'2037': '7505'
'2038': '7507'
'2039': '7510'
'2040': '7511'
'2041': '7512'
'2042': '7514'
'2043': '7515'
'2044': '7517'
'2045': '7518'
'2046': '7520'
'2047': '7522'
'2048': '7525'
'2049': '753'
'2050': '7538'
'2051': '7540'
'2052': '7552'
'2053': '7553'
'2054': '7555'
'2055': '7556'
'2056': '7558'
'2057': '7559'
'2058': '7561'
'2059': '7565'
'2060': '7569'
'2061': '7584'
'2062': '7585'
'2063': '7594'
'2064': '7597'
'2065': '7601'
'2066': '7603'
'2067': '7607'
'2068': '7608'
'2069': '7609'
'2070': '7618'
'2071': '7635'
'2072': '764'
'2073': '7640'
'2074': '7641'
'2075': '7644'
'2076': '7647'
'2077': '7649'
'2078': '7654'
'2079': '7657'
'2080': '766'
'2081': '7665'
'2082': '7672'
'2083': '7679'
'2084': '7683'
'2085': '7687'
'2086': '7688'
'2087': '7691'
'2088': '7697'
'2089': '7699'
'2090': '77'
'2091': '770'
'2092': '7700'
'2093': '7702'
'2094': '7704'
'2095': '7705'
'2096': '7708'
'2097': '7713'
'2098': '7717'
'2099': '7720'
'2100': '7729'
'2101': '7730'
'2102': '7732'
'2103': '7733'
'2104': '7737'
'2105': '7739'
'2106': '774'
'2107': '7746'
'2108': '7749'
'2109': '7752'
'2110': '7754'
'2111': '7756'
'2112': '7762'
'2113': '7764'
'2114': '7766'
'2115': '7769'
'2116': '777'
'2117': '7777'
'2118': '778'
'2119': '7780'
'2120': '7783'
'2121': '7786'
'2122': '7789'
'2123': '779'
'2124': '7794'
'2125': '7795'
'2126': '7796'
'2127': '78'
'2128': '780'
'2129': '7800'
'2130': '7802'
'2131': '7809'
'2132': '781'
'2133': '7816'
'2134': '782'
'2135': '7823'
'2136': '7825'
'2137': '7826'
'2138': '7828'
'2139': '783'
'2140': '7832'
'2141': '7833'
'2142': '7835'
'2143': '7837'
'2144': '7839'
'2145': '7843'
'2146': '7848'
'2147': '7850'
'2148': '7859'
'2149': '7867'
'2150': '7868'
'2151': '7871'
'2152': '7874'
'2153': '7879'
'2154': '7881'
'2155': '7883'
'2156': '7886'
'2157': '789'
'2158': '7892'
'2159': '7898'
'2160': '79'
'2161': '7902'
'2162': '7909'
'2163': '791'
'2164': '7910'
'2165': '7912'
'2166': '792'
'2167': '7923'
'2168': '7925'
'2169': '7926'
'2170': '7932'
'2171': '7933'
'2172': '7938'
'2173': '7939'
'2174': '7942'
'2175': '7945'
'2176': '7946'
'2177': '7949'
'2178': '7956'
'2179': '7957'
'2180': '7959'
'2181': '7962'
'2182': '7967'
'2183': '797'
'2184': '7975'
'2185': '7976'
'2186': '7981'
'2187': '7982'
'2188': '7988'
'2189': '7991'
'2190': '7994'
'2191': '7995'
'2192': '7997'
'2193': '8005'
'2194': '8006'
'2195': '8008'
'2196': '8009'
'2197': '8011'
'2198': '8012'
'2199': '8014'
'2200': '8015'
'2201': '8023'
'2202': '8028'
'2203': '803'
'2204': '8033'
'2205': '8040'
'2206': '8042'
'2207': '8044'
'2208': '8050'
'2209': '8051'
'2210': '8057'
'2211': '8058'
'2212': '806'
'2213': '8063'
'2214': '8066'
'2215': '807'
'2216': '8071'
'2217': '8072'
'2218': '8075'
'2219': '8080'
'2220': '8087'
'2221': '8088'
'2222': '8095'
'2223': '8097'
'2224': '8098'
'2225': '81'
'2226': '810'
'2227': '8108'
'2228': '811'
'2229': '8112'
'2230': '8113'
'2231': '8118'
'2232': '8119'
'2233': '8123'
'2234': '8131'
'2235': '8138'
'2236': '8142'
'2237': '8143'
'2238': '8148'
'2239': '815'
'2240': '8152'
'2241': '8156'
'2242': '816'
'2243': '8163'
'2244': '8164'
'2245': '8168'
'2246': '8169'
'2247': '8172'
'2248': '8173'
'2249': '8176'
'2250': '8180'
'2251': '8183'
'2252': '8188'
'2253': '8190'
'2254': '8193'
'2255': '8194'
'2256': '8195'
'2257': '8197'
'2258': '8199'
'2259': '82'
'2260': '820'
'2261': '8200'
'2262': '8208'
'2263': '8215'
'2264': '8222'
'2265': '8224'
'2266': '8225'
'2267': '8226'
'2268': '8228'
'2269': '8230'
'2270': '8238'
'2271': '8240'
'2272': '8242'
'2273': '8245'
'2274': '8246'
'2275': '8250'
'2276': '8254'
'2277': '8259'
'2278': '826'
'2279': '8262'
'2280': '8266'
'2281': '8272'
'2282': '8273'
'2283': '8280'
'2284': '8288'
'2285': '829'
'2286': '8291'
'2287': '8295'
'2288': '8296'
'2289': '8297'
'2290': '83'
'2291': '830'
'2292': '8300'
'2293': '8302'
'2294': '8307'
'2295': '831'
'2296': '8312'
'2297': '8316'
'2298': '8321'
'2299': '8322'
'2300': '8324'
'2301': '8328'
'2302': '8329'
'2303': '8334'
'2304': '8337'
'2305': '834'
'2306': '8346'
'2307': '8347'
'2308': '835'
'2309': '8356'
'2310': '836'
'2311': '8367'
'2312': '8382'
'2313': '8388'
'2314': '8389'
'2315': '839'
'2316': '8392'
'2317': '8394'
'2318': '8396'
'2319': '84'
'2320': '8401'
'2321': '8404'
'2322': '8410'
'2323': '8413'
'2324': '8414'
'2325': '8415'
'2326': '8419'
'2327': '8421'
'2328': '8422'
'2329': '8424'
'2330': '8425'
'2331': '8430'
'2332': '8432'
'2333': '844'
'2334': '8441'
'2335': '8443'
'2336': '8444'
'2337': '8445'
'2338': '8447'
'2339': '845'
'2340': '8455'
'2341': '8459'
'2342': '846'
'2343': '8461'
'2344': '8463'
'2345': '8464'
'2346': '8465'
'2347': '8466'
'2348': '8468'
'2349': '8470'
'2350': '8474'
'2351': '8476'
'2352': '8479'
'2353': '8490'
'2354': '8494'
'2355': '8498'
'2356': '8499'
'2357': '85'
'2358': '850'
'2359': '8500'
'2360': '8506'
'2361': '851'
'2362': '8527'
'2363': '8531'
'2364': '8534'
'2365': '8536'
'2366': '8543'
'2367': '8544'
'2368': '8545'
'2369': '8555'
'2370': '8565'
'2371': '8573'
'2372': '8575'
'2373': '8576'
'2374': '8580'
'2375': '8587'
'2376': '8590'
'2377': '8591'
'2378': '8592'
'2379': '8605'
'2380': '8609'
'2381': '8619'
'2382': '8625'
'2383': '8629'
'2384': '8630'
'2385': '8631'
'2386': '8632'
'2387': '8635'
'2388': '8643'
'2389': '8644'
'2390': '8664'
'2391': '8666'
'2392': '8671'
'2393': '8675'
'2394': '8677'
'2395': '8678'
'2396': '868'
'2397': '8684'
'2398': '8687'
'2399': '8699'
'2400': '87'
'2401': '8705'
'2402': '8710'
'2403': '8713'
'2404': '8718'
'2405': '8722'
'2406': '8725'
'2407': '8742'
'2408': '8747'
'2409': '8753'
'2410': '8758'
'2411': '876'
'2412': '8765'
'2413': '8770'
'2414': '8771'
'2415': '8772'
'2416': '8776'
'2417': '8778'
'2418': '8786'
'2419': '8791'
'2420': '8797'
'2421': '8799'
'2422': '8803'
'2423': '8808'
'2424': '882'
'2425': '8820'
'2426': '8824'
'2427': '8825'
'2428': '8838'
'2429': '884'
'2430': '8842'
'2431': '8846'
'2432': '8848'
'2433': '8855'
'2434': '886'
'2435': '8867'
'2436': '887'
'2437': '8875'
'2438': '8879'
'2439': '8887'
'2440': '8897'
'2441': '89'
'2442': '895'
'2443': '8975'
'2444': '899'
'2445': '90'
'2446': '9000'
'2447': '9022'
'2448': '9023'
'2449': '9026'
'2450': '908'
'2451': '909'
'2452': '91'
'2453': '911'
'2454': '915'
'2455': '92'
'2456': '920'
'2457': '921'
'2458': '922'
'2459': '923'
'2460': '925'
'2461': '927'
'2462': '93'
'2463': '937'
'2464': '94'
'2465': '948'
'2466': '949'
'2467': '951'
'2468': '953'
'2469': '954'
'2470': '956'
'2471': '957'
'2472': '960'
'2473': '964'
'2474': '968'
'2475': '969'
'2476': '976'
'2477': '978'
'2478': '979'
'2479': '98'
'2480': '982'
'2481': '984'
'2482': '985'
'2483': '986'
splits:
- name: train
num_bytes: 1183811374.891
num_examples: 14481
- name: test
num_bytes: 1456513851.552
num_examples: 7452
download_size: 2639349234
dataset_size: 2640325226.443
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
task_categories:
- audio-classification
tags:
- audio
- multiclass
- speaker
- speech
---
# LibriSpeech Speaker Identification
LibriSpeech is a corpus of approximately 1000 hours of 16kHz read English speech, prepared by Vassil Panayotov with the assistance of Daniel Povey.
The data is derived from read audiobooks from the LibriVox project, and has been carefully segmented and aligned.
Although LibriSpeech is most commonly used for ASR, we use it here for a speaker identification task.
We follow the official training/evaluation split from the SincNet paper.
It has 2484 classes (unique speakers) and 21933 samples in total (14481 train / 7452 test).
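As a minimal sketch of how the splits can be loaded and inspected with the `datasets` library (the repository ID below is a placeholder, not this dataset's actual Hub ID):
```python
from datasets import load_dataset

# Placeholder repository ID; substitute this dataset's actual ID on the Hub.
ds = load_dataset("<namespace>/librispeech-speaker-id")

# Expected sizes from the split metadata above: 14481 train / 7452 test samples.
print(ds["train"].num_rows, ds["test"].num_rows)

# Inspect the features, including the speaker label (2484 unique speakers).
print(ds["train"].features)
```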
## Citation
```bibtex
@misc{ravanelli2019speaker,
title={Speaker Recognition from Raw Waveform with SincNet},
author={Mirco Ravanelli and Yoshua Bengio},
year={2019},
eprint={1808.00158},
archivePrefix={arXiv},
primaryClass={eess.AS}
}
``` |
trondizzy/uk_en_combined_OPUS_sets | ---
license: cc
task_categories:
- translation
language:
- uk
- en
size_categories:
- 1M<n<10M
--- |
bigscience-data/roots_fr_the_pile_europarl | ---
language: fr
license: mit
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_fr_the_pile_europarl
# the_pile_europarl
- Dataset uid: `the_pile_europarl`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.1278 % of total
- 0.4112 % of fr
- 1.5555 % of pt
- 0.7511 % of es
- 0.1503 % of en
### BigScience processing steps
#### Filters applied to: fr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: pt
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: es
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: en
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
|
mwong/climatetext-claim-related-evaluation | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-sa-3.0
- gpl-3.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- extended|climate_text
task_categories:
- text-classification
task_ids:
- fact-checking
---
### Dataset Summary
This dataset is extracted from the Climate Text dataset (https://www.sustainablefinance.uzh.ch/en/research/climate-fever/climatext.html), pre-processed and ready to evaluate.
The evaluation objective is a text classification task: given a climate-related claim and evidence, predict whether the claim is related to the evidence. |
ducha07/vn_news_audio | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: 'Unnamed: 0'
dtype: int64
- name: start_time
dtype: float64
- name: end_time
dtype: float64
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 573339872.698
num_examples: 14907
download_size: 556319244
dataset_size: 573339872.698
---
# Dataset Card for "audio_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-conll2003-2dc2f6d8-11805572 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- conll2003
eval_info:
task: entity_extraction
model: AJGP/bert-finetuned-ner
metrics: []
dataset_name: conll2003
dataset_config: conll2003
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: AJGP/bert-finetuned-ner
* Dataset: conll2003
* Config: conll2003
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@hrezaeim](https://huggingface.co/hrezaeim) for evaluating this model. |
leosocy/palmnet | ---
language:
- en
license: apache-2.0
size_categories:
- 1K<n<10K
task_categories:
- image-classification
- image-feature-extraction
- feature-extraction
pretty_name: PalmNet
dataset_info:
- config_name: coep
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 408971428.0
num_examples: 1305
download_size: 398257468
dataset_size: 408971428.0
- config_name: polyu
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 419352000.0
num_examples: 24000
download_size: 396344896
dataset_size: 419352000.0
- config_name: polyu-roi
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 419352000.0
num_examples: 24000
download_size: 396344896
dataset_size: 419352000.0
- config_name: tongji
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 209673800.0
num_examples: 12000
download_size: 192259567
dataset_size: 209673800.0
- config_name: tongji-roi
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 209673800.0
num_examples: 12000
download_size: 192259567
dataset_size: 209673800.0
- config_name: tsinghua
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 62999083.52
num_examples: 1280
download_size: 44953696
dataset_size: 62999083.52
- config_name: tsinghua-roi
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 62999083.52
num_examples: 1280
download_size: 44953696
dataset_size: 62999083.52
configs:
- config_name: coep
data_files:
- split: train
path: subsets/coep/train-*
- config_name: polyu
data_files:
- split: train
path: subsets/polyu/train-*
default: true
- config_name: polyu-roi
data_files:
- split: train
path: subsets/polyu/train-*
- config_name: tongji
data_files:
- split: train
path: subsets/tongji/train-*
- config_name: tongji-roi
data_files:
- split: train
path: subsets/tongji/train-*
- config_name: tsinghua
data_files:
- split: train
path: subsets/tsinghua/train-*
- config_name: tsinghua-roi
data_files:
- split: train
path: subsets/tsinghua/train-*
---
|
1aurent/unsplash-lite-palette | ---
dataset_info:
features:
- name: url
dtype: string
- name: ai_description
dtype: string
- name: palettes
struct:
- name: '1'
dtype:
array2_d:
shape:
- 1
- 3
dtype: uint8
- name: '2'
dtype:
array2_d:
shape:
- 2
- 3
dtype: uint8
- name: '3'
dtype:
array2_d:
shape:
- 3
- 3
dtype: uint8
- name: '4'
dtype:
array2_d:
shape:
- 4
- 3
dtype: uint8
- name: '5'
dtype:
array2_d:
shape:
- 5
- 3
dtype: uint8
- name: '6'
dtype:
array2_d:
shape:
- 6
- 3
dtype: uint8
- name: '7'
dtype:
array2_d:
shape:
- 7
- 3
dtype: uint8
- name: '8'
dtype:
array2_d:
shape:
- 8
- 3
dtype: uint8
splits:
- name: train
num_bytes: 28536733
num_examples: 24998
download_size: 4159745
dataset_size: 28536733
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: other
license_name: unsplash-commercial
license_link: https://github.com/unsplash/datasets/blob/master/DOCS.md
task_categories:
- text-to-image
- image-to-text
language:
- en
tags:
- unsplash
- v1.2.1
pretty_name: Unsplash Lite w/ Palettes
size_categories:
- 10K<n<100K
---
# The Unsplash Lite Dataset (v1.2.1) with color palettes

The Lite dataset contains all of the same fields as the Full dataset, but is limited to ~25,000 photos.
It can be used for both commercial and non-commercial usage, provided you abide by [the terms](https://github.com/unsplash/datasets/blob/master/TERMS.md).
The Unsplash Dataset is made available for research purposes.
[It cannot be used to redistribute the images contained within](https://github.com/unsplash/datasets/blob/master/TERMS.md).
To use the Unsplash library in a product, see [the Unsplash API](https://unsplash.com/developers).
This subset of the dataset contains only URLs to the images, their descriptions generated by an AI service, and 8 palettes (generated using [okolors](https://github.com/Ivordir/Okolors)).
To download the images from the URLs, you may do something like this:
```python
from datasets import load_dataset, DownloadManager, Image

ds = load_dataset("1aurent/unsplash-lite-palette")

def download_image(url: str | list[str], dl_manager: DownloadManager):
    # Download a batch of URLs and expose the local file paths as "image".
    filename = dl_manager.download(url)
    return {"image": filename}

ds = ds.map(
    function=download_image,
    input_columns=["url"],
    fn_kwargs={
        "dl_manager": DownloadManager(),
    },
    batched=True,  # download_image receives lists of URLs
    num_proc=6,    # run the downloads in 6 parallel processes
)

# Decode the downloaded files as images.
ds = ds.cast_column(
    column="image",
    feature=Image(),
)
```
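As a small follow-up sketch (assuming `ds` from the snippet above), the `palettes` struct maps each palette size `'1'` through `'8'` to an array of RGB rows:
```python
# Fetch the 5-colour palette of the first photo: a 5x3 array of uint8 RGB values.
# Assumes `ds` was loaded and mapped as in the snippet above.
palette_5 = ds["train"][0]["palettes"]["5"]
for r, g, b in palette_5:
    print(f"#{r:02x}{g:02x}{b:02x}")  # print each colour as a hex code
```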
 |
harpomaxx/unix-commands | ---
license: cc-by-4.0
tags:
- instruction-finetuning
pretty_name: unix-commands-dataset
task_categories:
- text-generation
---
# Unix Commands Dataset
## Description
The Unix Commands Dataset is a unique collection of real-world Unix command line examples, captured from various system prompts representing different user roles and responsibilities, such as system administrators, DevOps, network administrators, Docker administrators, regular users, and hackers.
The dataset consists of Unix commands ranging from basic to advanced and drawn from a wide array of categories, including file operations (`ls`, `cd`), system information (`uname`, `top`), network configuration (`ifconfig`, `netstat`), text manipulation (`grep`, `awk`), process management (`ps`, `kill`), package management (`apt-get`, `yum`), and various others. Each command is paired with its expected output to help illustrate the command's behavior.
The dataset also includes examples related to certain specific roles, like Docker commands for Docker administrators and `iptables` commands for network administrators. This helps in showcasing the diversity of Unix commands in different work contexts.
## Dataset Structure
The dataset is structured following the Alpaca format (a sample record is sketched after this list):
1. **instruction**: A simple prompt to force the LLM to act as a Unix Terminal. You will probably need to change that.
2. **input**: The command prompt, including the username, hostname, and current directory. Example: `user@webserver:~$` Followed by the Unix command input by the user. Example: `ls /home`
3. **output**: The expected output from the Unix command. Example: `john emily alex`
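A minimal sketch of what a single record looks like in this layout (the values are illustrative, pieced together from the field examples above; the exact instruction wording in the dataset may differ):
```python
# Illustrative record following the Alpaca-style fields described above.
example = {
    "instruction": "Act as a Unix terminal and reply only with the command output.",
    "input": "user@webserver:~$ ls /home",
    "output": "john emily alex",
}
```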
## Usage
This dataset can be used to fine-tune a language model with a focus on Unix command line usage. This could lead to the development of AI models that can provide real-time assistance on Unix command lines, help in Unix system automation, perform Unix command prediction, or aid in cybersecurity analysis by understanding system logs.
By understanding this dataset, the language model can learn to provide more accurate and contextually appropriate responses when generating text related to Unix systems and command-line interactions.
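For instance, a minimal sketch of loading the dataset with the `datasets` library before fine-tuning (a default `train` split is assumed here):
```python
from datasets import load_dataset

# Load the Unix Commands Dataset; a "train" split is assumed.
ds = load_dataset("harpomaxx/unix-commands", split="train")

# Each row carries the instruction / input / output fields described above.
print(ds[0])
```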
## Note
The Unix Commands Dataset is not intended to teach Unix system administration or serve as a comprehensive guide to Unix commands. Rather, it provides real-world examples of how commands are used in various contexts, which can be valuable for AI training and natural language processing tasks.
|
open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7b-SLERPv0.9 | ---
pretty_name: Evaluation run of jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9](https://huggingface.co/jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7b-SLERPv0.9\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-07T03:31:40.171262](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7b-SLERPv0.9/blob/main/results_2024-04-07T03-31-40.171262.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6559143487632242,\n\
\ \"acc_stderr\": 0.03200996549462357,\n \"acc_norm\": 0.6554287821986152,\n\
\ \"acc_norm_stderr\": 0.032675253303136656,\n \"mc1\": 0.5936352509179926,\n\
\ \"mc1_stderr\": 0.017193835812093886,\n \"mc2\": 0.7482652601699259,\n\
\ \"mc2_stderr\": 0.014273429873734122\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7081911262798635,\n \"acc_stderr\": 0.01328452529240351,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710695\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7230631348336984,\n\
\ \"acc_stderr\": 0.00446570481089354,\n \"acc_norm\": 0.887572196773551,\n\
\ \"acc_norm_stderr\": 0.003152464637757645\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"\
acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4659217877094972,\n\
\ \"acc_stderr\": 0.016683615837486867,\n \"acc_norm\": 0.4659217877094972,\n\
\ \"acc_norm_stderr\": 0.016683615837486867\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667878,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n\
\ \"acc_stderr\": 0.01275616194252337,\n \"acc_norm\": 0.4765319426336376,\n\
\ \"acc_norm_stderr\": 0.01275616194252337\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5936352509179926,\n\
\ \"mc1_stderr\": 0.017193835812093886,\n \"mc2\": 0.7482652601699259,\n\
\ \"mc2_stderr\": 0.014273429873734122\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222782\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \
\ \"acc_stderr\": 0.012714401009923647\n }\n}\n```"
repo_url: https://huggingface.co/jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|arc:challenge|25_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|gsm8k|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hellaswag|10_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|winogrande|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-07T03-31-40.171262.parquet'
- config_name: results
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- results_2024-04-07T03-31-40.171262.parquet
- split: latest
path:
- results_2024-04-07T03-31-40.171262.parquet
---
# Dataset Card for Evaluation run of jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9](https://huggingface.co/jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7b-SLERPv0.9",
"harness_winogrande_5",
	split="latest")
```
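The aggregated metrics mentioned above live in the "results" configuration. As a minimal sketch (same `load_dataset` API as above, using the "latest" split declared in the YAML header), they can be loaded like this:
```python
from datasets import load_dataset

# Aggregated metrics of the run: the "results" configuration, "latest" split
results = load_dataset(
    "open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7b-SLERPv0.9",
    "results",
    split="latest",
)

# The exact row schema is not documented here; inspect it before relying on it
print(results)
print(results[0])
```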
## Latest results
These are the [latest results from run 2024-04-07T03:31:40.171262](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7b-SLERPv0.9/blob/main/results_2024-04-07T03-31-40.171262.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6559143487632242,
"acc_stderr": 0.03200996549462357,
"acc_norm": 0.6554287821986152,
"acc_norm_stderr": 0.032675253303136656,
"mc1": 0.5936352509179926,
"mc1_stderr": 0.017193835812093886,
"mc2": 0.7482652601699259,
"mc2_stderr": 0.014273429873734122
},
"harness|arc:challenge|25": {
"acc": 0.7081911262798635,
"acc_stderr": 0.01328452529240351,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710695
},
"harness|hellaswag|10": {
"acc": 0.7230631348336984,
"acc_stderr": 0.00446570481089354,
"acc_norm": 0.887572196773551,
"acc_norm_stderr": 0.003152464637757645
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.040064856853653415,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.040064856853653415
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4659217877094972,
"acc_stderr": 0.016683615837486867,
"acc_norm": 0.4659217877094972,
"acc_norm_stderr": 0.016683615837486867
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667878,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4765319426336376,
"acc_stderr": 0.01275616194252337,
"acc_norm": 0.4765319426336376,
"acc_norm_stderr": 0.01275616194252337
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5936352509179926,
"mc1_stderr": 0.017193835812093886,
"mc2": 0.7482652601699259,
"mc2_stderr": 0.014273429873734122
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222782
},
"harness|gsm8k|5": {
"acc": 0.6921910538286581,
"acc_stderr": 0.012714401009923647
}
}
```
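For reference, the same summary can also be pulled as the JSON file linked above. A small sketch using `huggingface_hub` (the top-level key layout of the file is an assumption and may differ):
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results JSON linked above from this dataset repository
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7b-SLERPv0.9",
    filename="results_2024-04-07T03-31-40.171262.json",
    repo_type="dataset",
)

with open(path) as f:
    payload = json.load(f)

# Recent leaderboard dumps nest per-task metrics under a "results" key;
# fall back to the top level if that key is absent.
metrics = payload.get("results", payload)
print(metrics["all"])
```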
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tyzhu/squad_qa_no_id_v5_full_recite_full_passage | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 9247014
num_examples: 5070
- name: validation
num_bytes: 580390
num_examples: 300
download_size: 1781909
dataset_size: 9827404
---
# Dataset Card for "squad_qa_no_id_v5_full_recite_full_passage"
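A minimal usage sketch (based only on the configuration and splits declared in the YAML header above):
```python
from datasets import load_dataset

# The YAML header declares a default config with "train" and "validation" splits
ds = load_dataset("tyzhu/squad_qa_no_id_v5_full_recite_full_passage", split="validation")

print(ds)  # features: id, title, context, question, answers, answer, context_id, inputs, targets
print(ds[0]["question"], "->", ds[0]["answer"])
```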
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_pmlb_phoneme_sgosdt_l256_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 154480000
num_examples: 10000
- name: validation
num_bytes: 154480000
num_examples: 10000
download_size: 67311028
dataset_size: 308960000
---
# Dataset Card for "autotree_pmlb_phoneme_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kuanchy/Kuanchy | ---
license: unknown
---
|
vikp/textbook_gen6 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: eos
dtype: bool
- name: kind
dtype: string
- name: topic
dtype: string
- name: model
dtype: string
- name: combined
dtype: string
splits:
- name: train
num_bytes: 2488746746.5148544
num_examples: 71313
download_size: 1040296902
dataset_size: 2488746746.5148544
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "textbook_gen6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
soddokayo/crime1 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 12070
num_examples: 180
download_size: 0
dataset_size: 12070
---
# Dataset Card for "crime1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/Korean_Speech_Data_by_Mobile_Phone_Guiding | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Korean_Speech_Data_by_Mobile_Phone_Guiding
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/61?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
It collects speech from 211 Korean locals (99 females, 112 males), recorded in a quiet indoor environment. Recording devices are mainstream Android phones and iPhones.
For more details, please refer to the link: https://www.nexdata.ai/datasets/61?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
Korean
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions |
BerMaker/afqmc_small | ---
license: apache-2.0
text:
text-classification:
language:
- zh
type:
- binary-class
---
# Overview
AFQMC (Ant Financial Question Matching Corpus) is the Ant Financial semantic similarity dataset for question similarity computation: given two sentences from a user's customer-service description, an algorithm judges whether they express the same meaning.
# Dataset Description
This dataset contains only a small sample of AFQMC, intended for testing.
### Clone with HTTP
* http://www.modelscope.cn/datasets/modelscope/afqmc_small.git |
Hikam/PreprocessedReviewDataset | ---
license: apache-2.0
---
|
arieg/cluster03_large_150 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '000534'
'1': 000821
'2': '001102'
'3': 005381
'4': 006802
'5': 008345
'6': 008357
'7': 011682
'8': '011776'
'9': 014586
'10': 016994
'11': 020369
'12': '023353'
'13': '026600'
'14': 032338
'15': '036146'
'16': 046928
'17': 048440
'18': 048465
'19': 048931
'20': '050752'
'21': 052389
'22': '052647'
'23': '056523'
'24': 057820
'25': 061492
'26': '062005'
'27': 064840
'28': 066649
'29': '067365'
'30': 067638
'31': 073169
'32': 075936
'33': 084484
'34': 086262
'35': 087192
'36': 087430
'37': 087431
'38': 088486
'39': 090804
'40': 091459
'41': 097373
'42': 097847
'43': '100976'
'44': '104724'
'45': '106937'
'46': '112196'
'47': '114242'
'48': '114942'
'49': '115473'
'50': '116098'
'51': '116237'
'52': '116467'
'53': '116489'
'54': '119896'
'55': '122647'
'56': '122911'
'57': '122936'
'58': '125238'
'59': '125622'
'60': '125825'
'61': '126229'
'62': '126230'
'63': '126283'
'64': '126410'
'65': '127349'
'66': '127498'
'67': '127648'
'68': '142671'
'69': '145609'
'70': '148610'
'71': '153956'
splits:
- name: train
num_bytes: 536712015.6
num_examples: 10800
download_size: 540685130
dataset_size: 536712015.6
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
linhqyy/data_test_whisper_large_v2_legit | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
- name: pred_str
dtype: string
splits:
- name: train
num_bytes: 174278487.625
num_examples: 1299
download_size: 164191689
dataset_size: 174278487.625
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data_test_whisper_large_v2_legit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Miniex/NateJpvozvocals | ---
license: openrail
---
|
open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r8_a16 | ---
pretty_name: Evaluation run of BFauber/lora_llama2-13b_10e5_r8_a16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/lora_llama2-13b_10e5_r8_a16](https://huggingface.co/BFauber/lora_llama2-13b_10e5_r8_a16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r8_a16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T00:30:01.535975](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r8_a16/blob/main/results_2024-02-10T00-30-01.535975.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5567620303421783,\n\
\ \"acc_stderr\": 0.03366103654428546,\n \"acc_norm\": 0.5624684566863432,\n\
\ \"acc_norm_stderr\": 0.03438060269276338,\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.3795479681605362,\n\
\ \"mc2_stderr\": 0.01379514557538818\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5639931740614335,\n \"acc_stderr\": 0.014491225699230916,\n\
\ \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.014332236306790149\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6164110734913364,\n\
\ \"acc_stderr\": 0.004852658876775391,\n \"acc_norm\": 0.823043218482374,\n\
\ \"acc_norm_stderr\": 0.0038085217687699345\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.04060127035236397,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.04060127035236397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.5606936416184971,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.024180497164376907,\n \"\
acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376907\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.042407993275749255,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.042407993275749255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6806451612903226,\n \"acc_stderr\": 0.026522709674667765,\n \"\
acc_norm\": 0.6806451612903226,\n \"acc_norm_stderr\": 0.026522709674667765\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"\
acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.037563357751878974,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.037563357751878974\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624527,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624527\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5153846153846153,\n \"acc_stderr\": 0.025339003010106515,\n\
\ \"acc_norm\": 0.5153846153846153,\n \"acc_norm_stderr\": 0.025339003010106515\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028604,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028604\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7522935779816514,\n \"acc_stderr\": 0.018508143602547815,\n \"\
acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.018508143602547815\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598018,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598018\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483727,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483727\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7445721583652618,\n\
\ \"acc_stderr\": 0.015594955384455765,\n \"acc_norm\": 0.7445721583652618,\n\
\ \"acc_norm_stderr\": 0.015594955384455765\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016127,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016127\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27932960893854747,\n\
\ \"acc_stderr\": 0.015005762446786168,\n \"acc_norm\": 0.27932960893854747,\n\
\ \"acc_norm_stderr\": 0.015005762446786168\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.027634176689602656,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.027634176689602656\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200865,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200865\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.02931601177634355,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.02931601177634355\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\
\ \"acc_stderr\": 0.012604960816087371,\n \"acc_norm\": 0.4198174706649283,\n\
\ \"acc_norm_stderr\": 0.012604960816087371\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.030306257722468307,\n\
\ \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.030306257722468307\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.553921568627451,\n \"acc_stderr\": 0.020109864547181354,\n \
\ \"acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.020109864547181354\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087555,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087555\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357302,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.3795479681605362,\n\
\ \"mc2_stderr\": 0.01379514557538818\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025397\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.24564063684609552,\n \
\ \"acc_stderr\": 0.011857183603902225\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/lora_llama2-13b_10e5_r8_a16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|arc:challenge|25_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|gsm8k|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hellaswag|10_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-30-01.535975.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T00-30-01.535975.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- '**/details_harness|winogrande|5_2024-02-10T00-30-01.535975.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T00-30-01.535975.parquet'
- config_name: results
data_files:
- split: 2024_02_10T00_30_01.535975
path:
- results_2024-02-10T00-30-01.535975.parquet
- split: latest
path:
- results_2024-02-10T00-30-01.535975.parquet
---
# Dataset Card for Evaluation run of BFauber/lora_llama2-13b_10e5_r8_a16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/lora_llama2-13b_10e5_r8_a16](https://huggingface.co/BFauber/lora_llama2-13b_10e5_r8_a16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r8_a16",
	"harness_winogrande_5",
	split="latest")
```
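The aggregated metrics can be loaded the same way by pointing at the "results" configuration instead of a per-task one. The snippet below is a minimal sketch assuming the splits declared in this card ("latest" plus the run timestamp); the exact columns of the results parquet may differ.
```python
from datasets import load_dataset

# Load the aggregated results configuration; "latest" points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r8_a16",
    "results",
    split="latest",
)

# Inspect the first (and usually only) row of aggregated metrics.
print(results[0])
```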
## Latest results
These are the [latest results from run 2024-02-10T00:30:01.535975](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r8_a16/blob/main/results_2024-02-10T00-30-01.535975.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5567620303421783,
"acc_stderr": 0.03366103654428546,
"acc_norm": 0.5624684566863432,
"acc_norm_stderr": 0.03438060269276338,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.3795479681605362,
"mc2_stderr": 0.01379514557538818
},
"harness|arc:challenge|25": {
"acc": 0.5639931740614335,
"acc_stderr": 0.014491225699230916,
"acc_norm": 0.5972696245733788,
"acc_norm_stderr": 0.014332236306790149
},
"harness|hellaswag|10": {
"acc": 0.6164110734913364,
"acc_stderr": 0.004852658876775391,
"acc_norm": 0.823043218482374,
"acc_norm_stderr": 0.0038085217687699345
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.04060127035236397,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.04060127035236397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376907,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.042407993275749255,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.042407993275749255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667765,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667765
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.037563357751878974,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.037563357751878974
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624527,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624527
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5153846153846153,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.5153846153846153,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028604,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028604
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.018508143602547815,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.018508143602547815
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598018,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598018
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483727,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483727
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7445721583652618,
"acc_stderr": 0.015594955384455765,
"acc_norm": 0.7445721583652618,
"acc_norm_stderr": 0.015594955384455765
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016127,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016127
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27932960893854747,
"acc_stderr": 0.015005762446786168,
"acc_norm": 0.27932960893854747,
"acc_norm_stderr": 0.015005762446786168
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.027634176689602656,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.027634176689602656
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200865,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200865
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.02931601177634355,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.02931601177634355
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4198174706649283,
"acc_stderr": 0.012604960816087371,
"acc_norm": 0.4198174706649283,
"acc_norm_stderr": 0.012604960816087371
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.030306257722468307,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.030306257722468307
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.020109864547181354,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.020109864547181354
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.030862144921087555,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.030862144921087555
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357302,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.3795479681605362,
"mc2_stderr": 0.01379514557538818
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.011807360224025397
},
"harness|gsm8k|5": {
"acc": 0.24564063684609552,
"acc_stderr": 0.011857183603902225
}
}
```
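If you prefer the raw JSON over the parquet files, the results file linked above can also be fetched directly from the repository. This is a sketch using `huggingface_hub`; the filename is taken from the link in this card, and the key layout assumed in the comments may differ slightly from the actual file.
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results JSON for the latest run (filename taken from the link above).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r8_a16",
    filename="results_2024-02-10T00-30-01.535975.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# The card shows an object with an "all" entry holding the aggregate metrics;
# depending on the file layout it may be nested under a top-level "results" key.
metrics = results.get("results", results)
print(metrics["all"])
```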
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kardosdrur/opensubtitles-no-da | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: link_id
dtype: string
- name: da
dtype: string
- name: 'no'
dtype: string
- name: overlap
dtype: float64
splits:
- name: train
num_bytes: 270499727.08648384
num_examples: 1772983
- name: test
num_bytes: 67624969.91351616
num_examples: 443246
download_size: 201396375
dataset_size: 338124697.0
---
# OpenSubtitles Danish-Norwegian
Aligned Danish-Norwegian sentence pairs from OpenSubtitles, filtered with heuristic-based rules.
The source code for producing the dataset is included in the repository.
The dataset was created to aid training sentence transformers in the Danish Foundation Models project.
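
A minimal loading sketch, assuming the default configuration and the train/test splits declared above:
```python
from datasets import load_dataset

# Load the aligned Danish-Norwegian sentence pairs.
dataset = load_dataset("kardosdrur/opensubtitles-no-da")

# Each row has a link_id, the Danish and Norwegian sentences, and an overlap score.
example = dataset["train"][0]
print(example["da"], "|", example["no"], "|", example["overlap"])
```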
|
Tristan/test_dataset_for_predict | ---
dataset_info:
features:
- name: query
dtype: string
splits:
- name: train
num_bytes: 60
num_examples: 2
download_size: 926
dataset_size: 60
---
# Dataset Card for "test_dataset_for_predict"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Raagul04/DocVQA_train | ---
license: afl-3.0
---
|
yuan-sf63/word_label_0.8_72_P | ---
dataset_info:
features:
- name: text
dtype: string
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
- name: '16'
dtype: int64
- name: '17'
dtype: int64
- name: '18'
dtype: int64
- name: '19'
dtype: int64
- name: '20'
dtype: int64
- name: '21'
dtype: int64
- name: '22'
dtype: int64
- name: '23'
dtype: int64
- name: '24'
dtype: int64
- name: '25'
dtype: int64
- name: '26'
dtype: int64
- name: '27'
dtype: int64
- name: '28'
dtype: int64
- name: '29'
dtype: int64
- name: '30'
dtype: int64
- name: '31'
dtype: int64
- name: '32'
dtype: int64
- name: '33'
dtype: int64
- name: '34'
dtype: int64
- name: '35'
dtype: int64
- name: '36'
dtype: int64
- name: '37'
dtype: int64
- name: '38'
dtype: int64
- name: '39'
dtype: int64
- name: '40'
dtype: int64
- name: '41'
dtype: int64
- name: '42'
dtype: int64
- name: '43'
dtype: int64
- name: '44'
dtype: int64
- name: '45'
dtype: int64
- name: '46'
dtype: int64
- name: '47'
dtype: int64
- name: '48'
dtype: int64
- name: '49'
dtype: int64
- name: '50'
dtype: int64
- name: '51'
dtype: int64
- name: '52'
dtype: int64
- name: '53'
dtype: int64
- name: '54'
dtype: int64
- name: '55'
dtype: int64
- name: '56'
dtype: int64
- name: '57'
dtype: int64
- name: '58'
dtype: int64
- name: '59'
dtype: int64
- name: '60'
dtype: int64
- name: '61'
dtype: int64
- name: '62'
dtype: int64
- name: '63'
dtype: int64
- name: '64'
dtype: int64
- name: '65'
dtype: int64
- name: '66'
dtype: int64
- name: '67'
dtype: int64
- name: '68'
dtype: int64
- name: '69'
dtype: int64
- name: '70'
dtype: int64
- name: '71'
dtype: int64
splits:
- name: train
num_bytes: 50161848.00342051
num_examples: 71305
- name: validation
num_bytes: 5573694.996579492
num_examples: 7923
download_size: 9878155
dataset_size: 55735543.0
---
# Dataset Card for "word_label_0.8_72_P"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jlbaker361/spider-light | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: frame
dtype: int64
splits:
- name: train
num_bytes: 5732946153.5
num_examples: 1332
download_size: 5733102541
dataset_size: 5732946153.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|