| datasetId | card |
|---|---|
M-RKZ/chatbot | ---
task_categories:
- text-generation
language:
- en
pretty_name: chatbot_dataset_1
size_categories:
- n<1K
--- |
vietgpt/vungoi_question_type1 | ---
dataset_info:
features:
- name: metadata
struct:
- name: chapter
dtype: string
- name: difficult_degree
dtype: int64
- name: grade
dtype: string
- name: id
dtype: string
- name: idx
dtype: int64
- name: subject
dtype: string
- name: question
dtype: string
- name: options
list:
- name: answer
dtype: string
- name: key
dtype: string
- name: answer
struct:
- name: answer
dtype: string
- name: key
dtype: string
- name: solution
dtype: string
- name: quality
struct:
- name: has_image
dtype: bool
- name: missing_question
dtype: bool
- name: missing_solution
dtype: bool
splits:
- name: train
num_bytes: 140854723
num_examples: 112042
download_size: 88486050
dataset_size: 140854723
---
# Dataset Card for "vungoi_question_type1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dliu1/legal-llama-instruction1 | ---
license: apache-2.0
---
|
anan-2024/twitter_dataset_1713132870 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 142941
num_examples: 393
download_size: 80266
dataset_size: 142941
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HANTIFARAH/ma2 | ---
dataset_info:
config_name: castorini__mr-tydi-corpus__arabic
features:
- name: text
dtype: string
- name: source
dtype: string
- name: metadata
dtype: string
splits:
- name: train
num_bytes: 1365794115
num_examples: 2106586
download_size: 568484163
dataset_size: 1365794115
configs:
- config_name: castorini__mr-tydi-corpus__arabic
data_files:
- split: train
path: castorini__mr-tydi-corpus__arabic/train-*
---
|
open-llm-leaderboard/details_danielpark__gorani-100k-llama2-13b-instruct | ---
pretty_name: Evaluation run of danielpark/gorani-100k-llama2-13b-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [danielpark/gorani-100k-llama2-13b-instruct](https://huggingface.co/danielpark/gorani-100k-llama2-13b-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_danielpark__gorani-100k-llama2-13b-instruct_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-06T21:31:02.629994](https://huggingface.co/datasets/open-llm-leaderboard/details_danielpark__gorani-100k-llama2-13b-instruct_public/blob/main/results_2023-11-06T21-31-02.629994.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 8.284395973154363e-05,\n \"f1_stderr\"\
: 6.061110851297716e-05,\n \"acc\": 0.24822415153906865,\n \"acc_stderr\"\
: 0.0070260655734579345\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n\
\ \"em_stderr\": 0.0,\n \"f1\": 8.284395973154363e-05,\n \"\
f1_stderr\": 6.061110851297716e-05\n },\n \"harness|gsm8k|5\": {\n \
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.4964483030781373,\n \"acc_stderr\": 0.014052131146915869\n\
\ }\n}\n```"
repo_url: https://huggingface.co/danielpark/gorani-100k-llama2-13b-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_05T02_32_18.553422
path:
- '**/details_harness|drop|3_2023-11-05T02-32-18.553422.parquet'
- split: 2023_11_06T21_31_02.629994
path:
- '**/details_harness|drop|3_2023-11-06T21-31-02.629994.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-06T21-31-02.629994.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_05T02_32_18.553422
path:
- '**/details_harness|gsm8k|5_2023-11-05T02-32-18.553422.parquet'
- split: 2023_11_06T21_31_02.629994
path:
- '**/details_harness|gsm8k|5_2023-11-06T21-31-02.629994.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-06T21-31-02.629994.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_05T02_32_18.553422
path:
- '**/details_harness|winogrande|5_2023-11-05T02-32-18.553422.parquet'
- split: 2023_11_06T21_31_02.629994
path:
- '**/details_harness|winogrande|5_2023-11-06T21-31-02.629994.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-06T21-31-02.629994.parquet'
- config_name: results
data_files:
- split: 2023_11_05T02_32_18.553422
path:
- results_2023-11-05T02-32-18.553422.parquet
- split: 2023_11_06T21_31_02.629994
path:
- results_2023-11-06T21-31-02.629994.parquet
- split: latest
path:
- results_2023-11-06T21-31-02.629994.parquet
---
# Dataset Card for Evaluation run of danielpark/gorani-100k-llama2-13b-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/danielpark/gorani-100k-llama2-13b-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [danielpark/gorani-100k-llama2-13b-instruct](https://huggingface.co/danielpark/gorani-100k-llama2-13b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_danielpark__gorani-100k-llama2-13b-instruct_public",
"harness_winogrande_5",
split="train")
```
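The run splits are named after their timestamps, so they can be ordered to find the most recent run. A minimal stdlib sketch (split names taken from the configs above) showing how the "latest" alias relates to the newest timestamped split:

```python
from datetime import datetime

# Run splits are named after the run timestamp, with "_" in place of
# the "-" and ":" of ISO format (see the configs in the YAML header).
splits = ["2023_11_05T02_32_18.553422", "2023_11_06T21_31_02.629994"]

def parse_split(name: str) -> datetime:
    stamp, micros = name.rsplit(".", 1)
    dt = datetime.strptime(stamp, "%Y_%m_%dT%H_%M_%S")
    return dt.replace(microsecond=int(micros))

# "latest" points at the same files as the newest timestamped split.
newest = max(splits, key=parse_split)
```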
## Latest results
These are the [latest results from run 2023-11-06T21:31:02.629994](https://huggingface.co/datasets/open-llm-leaderboard/details_danielpark__gorani-100k-llama2-13b-instruct_public/blob/main/results_2023-11-06T21-31-02.629994.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 8.284395973154363e-05,
"f1_stderr": 6.061110851297716e-05,
"acc": 0.24822415153906865,
"acc_stderr": 0.0070260655734579345
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 8.284395973154363e-05,
"f1_stderr": 6.061110851297716e-05
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.4964483030781373,
"acc_stderr": 0.014052131146915869
}
}
```
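The "all" block above is consistent with averaging each metric over the tasks that report it (e.g. acc = (0.0 + 0.4964483030781373) / 2). A sketch of that aggregation, offered as an observation from the numbers rather than documented leaderboard behaviour:

```python
from collections import defaultdict

# Per-task metrics copied from the results above (stderr fields omitted).
results = {
    "harness|drop|3": {"em": 0.0, "f1": 8.284395973154363e-05},
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.4964483030781373},
}

# Average each metric over the tasks that report it.
sums, counts = defaultdict(float), defaultdict(int)
for task_metrics in results.values():
    for metric, value in task_metrics.items():
        sums[metric] += value
        counts[metric] += 1
overall = {metric: sums[metric] / counts[metric] for metric in sums}
# overall["acc"] matches the 0.24822415153906865 reported under "all".
```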
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
fimu-docproc-research/CIVQA_EasyOCR_Validation | ---
dataset_info:
features:
- name: id
dtype: string
- name: words
sequence: string
- name: answers
dtype: string
- name: bboxes
sequence:
sequence: float64
- name: answers_bboxes
sequence:
sequence: float64
- name: questions
dtype: string
- name: image
dtype: string
splits:
- name: validation
num_bytes: 48446674074
num_examples: 34159
download_size: 10985782991
dataset_size: 48446674074
license: mit
language:
- cs
tags:
- finance
---
# CIVQA EasyOCR Validation Dataset
The CIVQA (Czech Invoice Visual Question Answering) dataset was created with EasyOCR. This dataset contains only the validation split. The train split can be found at: https://huggingface.co/datasets/fimu-docproc-research/CIVQA_EasyOCR_Train
The validation split encoded for LayoutLM can be found at: https://huggingface.co/datasets/fimu-docproc-research/CIVQA_EasyOCR_LayoutLM_Validation
All invoices used in this dataset were obtained from public sources. Across these invoices, we focused on 15 different entities that are crucial for invoice processing:
- Invoice number
- Variable symbol
- Specific symbol
- Constant symbol
- Bank code
- Account number
- ICO
- Total amount
- Invoice date
- Due date
- Name of supplier
- IBAN
- DIC
- QR code
- Supplier's address
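The 15 target entities above can be kept as a constant when filtering or post-processing predictions. A minimal sketch (the helper function and its name are our own, not part of the dataset):

```python
# The 15 entity types targeted by CIVQA, as listed above.
CIVQA_ENTITIES = [
    "Invoice number", "Variable symbol", "Specific symbol", "Constant symbol",
    "Bank code", "Account number", "ICO", "Total amount", "Invoice date",
    "Due date", "Name of supplier", "IBAN", "DIC", "QR code",
    "Supplier's address",
]

def is_target_entity(label: str) -> bool:
    # Case-insensitive membership test for a predicted entity label.
    return label.strip().lower() in {e.lower() for e in CIVQA_ENTITIES}
```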
The invoices included in this dataset were gathered from the internet. We understand that privacy is of utmost importance, and we sincerely apologise for any inconvenience caused by the inclusion of your identifiable information in this dataset. If you have identified your data in this dataset and wish to have it removed from research use, please fill in the following form: https://forms.gle/tUVJKoB22oeTncUD6
We profoundly appreciate your cooperation and understanding in this matter. |
punwaiw/diffusionJockeyDynamics | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: dynamics
dtype: image
splits:
- name: train
num_bytes: 3946214461.25
num_examples: 17635
download_size: 3916831572
dataset_size: 3946214461.25
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Shakib75/final-cpp-programs-finetuning | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 8193116
num_examples: 5590
download_size: 2750918
dataset_size: 8193116
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
severo/fix-401 | ---
viewer: false
---
# Try to include an iframe
from observable:
<iframe width="100%" height="635" frameborder="0"
src="https://observablehq.com/embed/@d3/sortable-bar-chart?cell=viewof+order&cell=chart"></iframe>
from an HF space:
<iframe src="https://hf.space/embed/YoannLemesle/CLIPictionary/+?__theme=system" data-src="https://hf.space/embed/YoannLemesle/CLIPictionary/+" data-sdk="gradio" title="Gradio app" class="container p-0 flex-grow overflow-hidden space-iframe" allow="accelerometer; ambient-light-sensor; autoplay; battery; camera; document-domain; encrypted-media; fullscreen; geolocation; gyroscope; layout-animations; legacy-image-formats; magnetometer; microphone; midi; oversized-images; payment; picture-in-picture; publickey-credentials-get; sync-xhr; usb; vr ; wake-lock; xr-spatial-tracking" sandbox="allow-forms allow-modals allow-popups allow-popups-to-escape-sandbox allow-same-origin allow-scripts allow-downloads" scrolling="no" id="iFrameResizer0" style="overflow: hidden; height: 725px;"></iframe> |
freshpearYoon/train_free_19 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604606832
num_examples: 10000
download_size: 1267563511
dataset_size: 9604606832
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jordyvl/rvl_cdip_n_mp | ---
license: cc-by-nc-4.0
dataset_info:
features:
- name: id
dtype: string
- name: file
dtype: binary
- name: labels
dtype:
class_label:
names:
'0': letter
'1': form
'2': email
'3': handwritten
'4': advertisement
'5': scientific report
'6': scientific publication
'7': specification
'8': file folder
'9': news article
'10': budget
'11': invoice
'12': presentation
'13': questionnaire
'14': resume
'15': memo
splits:
- name: test
num_bytes: 1349159996
num_examples: 991
download_size: 0
dataset_size: 1349159996
---
# Dataset Card for RVL-CDIP-N_MultiPage
## Extension
The data loader supports loading RVL-CDIP-N in its extended multi-page format.
Big kudos to the original authors (first in CITATION) for collecting the RVL-CDIP-N dataset.
We stand on the shoulders of giants :)
## Required installation
```bash
pip3 install pypdf2 pdf2image
sudo apt-get install poppler-utils
``` |
nastyboget/stackmix_hkr | ---
license: mit
task_categories:
- image-to-text
language:
- ru
size_categories:
- 100K<n<1M
---
Dataset generated from the HKR train set using StackMix
===================================================
Number of images: 300000
Sources:
* [HKR dataset](https://github.com/abdoelsayed2016/HKR_Dataset)
* [Stackmix code](https://github.com/ai-forever/StackMix-OCR)
|
ShrinivasSK/mr_en_1 | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: tgt
dtype: string
- name: src
dtype: string
splits:
- name: train
num_bytes: 4586634.0
num_examples: 18000
- name: test
num_bytes: 509626.0
num_examples: 2000
download_size: 2687176
dataset_size: 5096260.0
---
# Dataset Card for "mr_en_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KeshavRa/Our_Team_Youth_Leaders_Database | ---
dataset_info:
features:
- name: questions
dtype: string
- name: answers
dtype: string
splits:
- name: train
num_bytes: 8229
num_examples: 62
download_size: 6928
dataset_size: 8229
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
xjlulu/ntu_adl_question | ---
configs:
- config_name: default
data_files:
- split: train
path: "train.csv"
- split: validation
path: "validation.csv"
- split: test
path: "test.csv"
- config_name: paragraphs
data_files:
- split: context
path: "context.csv"
- config_name: Preprocess
data_files:
- split: train
path: "Preprocess_train.csv"
- split: validation
path: "Preprocess_validation.csv"
- split: test
path: "Preprocess_test.csv"
- config_name: Viewer_Preprocess
data_files:
- split: validation
path: "Preprocess_validation.csv"
- split: test
path: "Preprocess_test.csv"
license: apache-2.0
task_categories:
- question-answering
language:
- zh
--- |
transcendingvictor/delphi-llama2-12.8m-validation-logprobs | ---
dataset_info:
features:
- name: logprobs
sequence: float64
splits:
- name: validation
num_bytes: 45818277
num_examples: 10982
download_size: 37722133
dataset_size: 45818277
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
hdotta/henry | ---
license: openrail
---
|
FernandoMacia/Dataset1 | ---
size_categories:
- 1K<n<10K
---
Extracted from <https://github.com/anthony-wang/BestPractices/tree/master/data>.
Fields:
* Formula (`string`)
* T (`float64`): Temperature (K)
* CP (`float64`): Heat capacity (J/(mol·K)) |
fmops/ai-traffic-flows | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 1361185.4960141717
num_examples: 1693
- name: test
num_bytes: 454264.50398582814
num_examples: 565
download_size: 1527131
dataset_size: 1815450.0
---
# Dataset Card for "ai-traffic-flows"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EuropeanParliament/Eurovoc | ---
license: eupl-1.1
configs:
- config_name: 1996-03
data_files: "files/1996-03.jsonl.gz"
- config_name: 1996-04
data_files: "files/1996-04.jsonl.gz"
- config_name: 1996-05
data_files: "files/1996-05.jsonl.gz"
- config_name: 1996-06
data_files: "files/1996-06.jsonl.gz"
- config_name: 1996-07
data_files: "files/1996-07.jsonl.gz"
- config_name: 1996-08
data_files: "files/1996-08.jsonl.gz"
- config_name: 1996-09
data_files: "files/1996-09.jsonl.gz"
- config_name: 1996-10
data_files: "files/1996-10.jsonl.gz"
- config_name: 1996-11
data_files: "files/1996-11.jsonl.gz"
- config_name: 1996-12
data_files: "files/1996-12.jsonl.gz"
- config_name: 1997-01
data_files: "files/1997-01.jsonl.gz"
- config_name: 1997-02
data_files: "files/1997-02.jsonl.gz"
- config_name: 1997-03
data_files: "files/1997-03.jsonl.gz"
- config_name: 1997-04
data_files: "files/1997-04.jsonl.gz"
- config_name: 1997-05
data_files: "files/1997-05.jsonl.gz"
- config_name: 1997-06
data_files: "files/1997-06.jsonl.gz"
- config_name: 1997-07
data_files: "files/1997-07.jsonl.gz"
- config_name: 1997-08
data_files: "files/1997-08.jsonl.gz"
- config_name: 1997-09
data_files: "files/1997-09.jsonl.gz"
- config_name: 1997-10
data_files: "files/1997-10.jsonl.gz"
- config_name: 1997-11
data_files: "files/1997-11.jsonl.gz"
- config_name: 1997-12
data_files: "files/1997-12.jsonl.gz"
- config_name: 1998-01
data_files: "files/1998-01.jsonl.gz"
- config_name: 1998-02
data_files: "files/1998-02.jsonl.gz"
- config_name: 1998-03
data_files: "files/1998-03.jsonl.gz"
- config_name: 1998-04
data_files: "files/1998-04.jsonl.gz"
- config_name: 1998-05
data_files: "files/1998-05.jsonl.gz"
- config_name: 1998-06
data_files: "files/1998-06.jsonl.gz"
- config_name: 1998-07
data_files: "files/1998-07.jsonl.gz"
- config_name: 1998-08
data_files: "files/1998-08.jsonl.gz"
- config_name: 1998-09
data_files: "files/1998-09.jsonl.gz"
- config_name: 1998-10
data_files: "files/1998-10.jsonl.gz"
- config_name: 1998-11
data_files: "files/1998-11.jsonl.gz"
- config_name: 1998-12
data_files: "files/1998-12.jsonl.gz"
- config_name: 1999-01
data_files: "files/1999-01.jsonl.gz"
- config_name: 1999-02
data_files: "files/1999-02.jsonl.gz"
- config_name: 1999-03
data_files: "files/1999-03.jsonl.gz"
- config_name: 1999-04
data_files: "files/1999-04.jsonl.gz"
- config_name: 1999-05
data_files: "files/1999-05.jsonl.gz"
- config_name: 1999-06
data_files: "files/1999-06.jsonl.gz"
- config_name: 1999-07
data_files: "files/1999-07.jsonl.gz"
- config_name: 1999-08
data_files: "files/1999-08.jsonl.gz"
- config_name: 1999-09
data_files: "files/1999-09.jsonl.gz"
- config_name: 1999-10
data_files: "files/1999-10.jsonl.gz"
- config_name: 1999-11
data_files: "files/1999-11.jsonl.gz"
- config_name: 1999-12
data_files: "files/1999-12.jsonl.gz"
- config_name: 2000-01
data_files: "files/2000-01.jsonl.gz"
- config_name: 2000-02
data_files: "files/2000-02.jsonl.gz"
- config_name: 2000-03
data_files: "files/2000-03.jsonl.gz"
- config_name: 2000-04
data_files: "files/2000-04.jsonl.gz"
- config_name: 2000-05
data_files: "files/2000-05.jsonl.gz"
- config_name: 2000-06
data_files: "files/2000-06.jsonl.gz"
- config_name: 2000-07
data_files: "files/2000-07.jsonl.gz"
- config_name: 2000-08
data_files: "files/2000-08.jsonl.gz"
- config_name: 2000-09
data_files: "files/2000-09.jsonl.gz"
- config_name: 2000-10
data_files: "files/2000-10.jsonl.gz"
- config_name: 2000-11
data_files: "files/2000-11.jsonl.gz"
- config_name: 2000-12
data_files: "files/2000-12.jsonl.gz"
- config_name: 2001-01
data_files: "files/2001-01.jsonl.gz"
- config_name: 2001-02
data_files: "files/2001-02.jsonl.gz"
- config_name: 2001-03
data_files: "files/2001-03.jsonl.gz"
- config_name: 2001-04
data_files: "files/2001-04.jsonl.gz"
- config_name: 2001-05
data_files: "files/2001-05.jsonl.gz"
- config_name: 2001-06
data_files: "files/2001-06.jsonl.gz"
- config_name: 2001-07
data_files: "files/2001-07.jsonl.gz"
- config_name: 2001-08
data_files: "files/2001-08.jsonl.gz"
- config_name: 2001-09
data_files: "files/2001-09.jsonl.gz"
- config_name: 2001-10
data_files: "files/2001-10.jsonl.gz"
- config_name: 2001-11
data_files: "files/2001-11.jsonl.gz"
- config_name: 2001-12
data_files: "files/2001-12.jsonl.gz"
- config_name: 2002-01
data_files: "files/2002-01.jsonl.gz"
- config_name: 2002-02
data_files: "files/2002-02.jsonl.gz"
- config_name: 2002-03
data_files: "files/2002-03.jsonl.gz"
- config_name: 2002-04
data_files: "files/2002-04.jsonl.gz"
- config_name: 2002-05
data_files: "files/2002-05.jsonl.gz"
- config_name: 2002-06
data_files: "files/2002-06.jsonl.gz"
- config_name: 2002-07
data_files: "files/2002-07.jsonl.gz"
- config_name: 2002-08
data_files: "files/2002-08.jsonl.gz"
- config_name: 2002-09
data_files: "files/2002-09.jsonl.gz"
- config_name: 2002-10
data_files: "files/2002-10.jsonl.gz"
- config_name: 2002-11
data_files: "files/2002-11.jsonl.gz"
- config_name: 2002-12
data_files: "files/2002-12.jsonl.gz"
- config_name: 2003-01
data_files: "files/2003-01.jsonl.gz"
- config_name: 2003-02
data_files: "files/2003-02.jsonl.gz"
- config_name: 2003-03
data_files: "files/2003-03.jsonl.gz"
- config_name: 2003-04
data_files: "files/2003-04.jsonl.gz"
- config_name: 2003-05
data_files: "files/2003-05.jsonl.gz"
- config_name: 2003-06
data_files: "files/2003-06.jsonl.gz"
- config_name: 2003-07
data_files: "files/2003-07.jsonl.gz"
- config_name: 2003-08
data_files: "files/2003-08.jsonl.gz"
- config_name: 2003-09
data_files: "files/2003-09.jsonl.gz"
- config_name: 2003-10
data_files: "files/2003-10.jsonl.gz"
- config_name: 2003-11
data_files: "files/2003-11.jsonl.gz"
- config_name: 2003-12
data_files: "files/2003-12.jsonl.gz"
- config_name: 2004-01
data_files: "files/2004-01.jsonl.gz"
- config_name: 2004-02
data_files: "files/2004-02.jsonl.gz"
- config_name: 2004-03
data_files: "files/2004-03.jsonl.gz"
- config_name: 2004-04
data_files: "files/2004-04.jsonl.gz"
- config_name: 2004-05
data_files: "files/2004-05.jsonl.gz"
- config_name: 2004-06
data_files: "files/2004-06.jsonl.gz"
- config_name: 2004-07
data_files: "files/2004-07.jsonl.gz"
- config_name: 2004-08
data_files: "files/2004-08.jsonl.gz"
- config_name: 2004-09
data_files: "files/2004-09.jsonl.gz"
- config_name: 2004-10
data_files: "files/2004-10.jsonl.gz"
- config_name: 2004-11
data_files: "files/2004-11.jsonl.gz"
- config_name: 2004-12
data_files: "files/2004-12.jsonl.gz"
- config_name: 2005-01
data_files: "files/2005-01.jsonl.gz"
- config_name: 2005-02
data_files: "files/2005-02.jsonl.gz"
- config_name: 2005-03
data_files: "files/2005-03.jsonl.gz"
- config_name: 2005-04
data_files: "files/2005-04.jsonl.gz"
- config_name: 2005-05
data_files: "files/2005-05.jsonl.gz"
- config_name: 2005-06
data_files: "files/2005-06.jsonl.gz"
- config_name: 2005-07
data_files: "files/2005-07.jsonl.gz"
- config_name: 2005-08
data_files: "files/2005-08.jsonl.gz"
- config_name: 2005-09
data_files: "files/2005-09.jsonl.gz"
- config_name: 2005-10
data_files: "files/2005-10.jsonl.gz"
- config_name: 2005-11
data_files: "files/2005-11.jsonl.gz"
- config_name: 2005-12
data_files: "files/2005-12.jsonl.gz"
- config_name: 2006-01
data_files: "files/2006-01.jsonl.gz"
- config_name: 2006-02
data_files: "files/2006-02.jsonl.gz"
- config_name: 2006-03
data_files: "files/2006-03.jsonl.gz"
- config_name: 2006-04
data_files: "files/2006-04.jsonl.gz"
- config_name: 2006-05
data_files: "files/2006-05.jsonl.gz"
- config_name: 2006-06
data_files: "files/2006-06.jsonl.gz"
- config_name: 2006-07
data_files: "files/2006-07.jsonl.gz"
- config_name: 2006-08
data_files: "files/2006-08.jsonl.gz"
- config_name: 2006-09
data_files: "files/2006-09.jsonl.gz"
- config_name: 2006-10
data_files: "files/2006-10.jsonl.gz"
- config_name: 2006-11
data_files: "files/2006-11.jsonl.gz"
- config_name: 2006-12
data_files: "files/2006-12.jsonl.gz"
- config_name: 2007-01
data_files: "files/2007-01.jsonl.gz"
- config_name: 2007-02
data_files: "files/2007-02.jsonl.gz"
- config_name: 2007-03
data_files: "files/2007-03.jsonl.gz"
- config_name: 2007-04
data_files: "files/2007-04.jsonl.gz"
- config_name: 2007-05
data_files: "files/2007-05.jsonl.gz"
- config_name: 2007-06
data_files: "files/2007-06.jsonl.gz"
- config_name: 2007-07
data_files: "files/2007-07.jsonl.gz"
- config_name: 2007-08
data_files: "files/2007-08.jsonl.gz"
- config_name: 2007-09
data_files: "files/2007-09.jsonl.gz"
- config_name: 2007-10
data_files: "files/2007-10.jsonl.gz"
- config_name: 2007-11
data_files: "files/2007-11.jsonl.gz"
- config_name: 2007-12
data_files: "files/2007-12.jsonl.gz"
- config_name: 2008-01
data_files: "files/2008-01.jsonl.gz"
- config_name: 2008-02
data_files: "files/2008-02.jsonl.gz"
- config_name: 2008-03
data_files: "files/2008-03.jsonl.gz"
- config_name: 2008-04
data_files: "files/2008-04.jsonl.gz"
- config_name: 2008-05
data_files: "files/2008-05.jsonl.gz"
- config_name: 2008-06
data_files: "files/2008-06.jsonl.gz"
- config_name: 2008-07
data_files: "files/2008-07.jsonl.gz"
- config_name: 2008-08
data_files: "files/2008-08.jsonl.gz"
- config_name: 2008-09
data_files: "files/2008-09.jsonl.gz"
- config_name: 2008-10
data_files: "files/2008-10.jsonl.gz"
- config_name: 2008-11
data_files: "files/2008-11.jsonl.gz"
- config_name: 2008-12
data_files: "files/2008-12.jsonl.gz"
- config_name: 2009-01
data_files: "files/2009-01.jsonl.gz"
- config_name: 2009-02
data_files: "files/2009-02.jsonl.gz"
- config_name: 2009-03
data_files: "files/2009-03.jsonl.gz"
- config_name: 2009-04
data_files: "files/2009-04.jsonl.gz"
- config_name: 2009-05
data_files: "files/2009-05.jsonl.gz"
- config_name: 2009-06
data_files: "files/2009-06.jsonl.gz"
- config_name: 2009-07
data_files: "files/2009-07.jsonl.gz"
- config_name: 2009-08
data_files: "files/2009-08.jsonl.gz"
- config_name: 2009-09
data_files: "files/2009-09.jsonl.gz"
- config_name: 2009-10
data_files: "files/2009-10.jsonl.gz"
- config_name: 2009-11
data_files: "files/2009-11.jsonl.gz"
- config_name: 2009-12
data_files: "files/2009-12.jsonl.gz"
- config_name: 2010-01
data_files: "files/2010-01.jsonl.gz"
- config_name: 2010-02
data_files: "files/2010-02.jsonl.gz"
- config_name: 2010-03
data_files: "files/2010-03.jsonl.gz"
- config_name: 2010-04
data_files: "files/2010-04.jsonl.gz"
- config_name: 2010-05
data_files: "files/2010-05.jsonl.gz"
- config_name: 2010-06
data_files: "files/2010-06.jsonl.gz"
- config_name: 2010-07
data_files: "files/2010-07.jsonl.gz"
- config_name: 2010-08
data_files: "files/2010-08.jsonl.gz"
- config_name: 2010-09
data_files: "files/2010-09.jsonl.gz"
- config_name: 2010-10
data_files: "files/2010-10.jsonl.gz"
- config_name: 2010-11
data_files: "files/2010-11.jsonl.gz"
- config_name: 2010-12
data_files: "files/2010-12.jsonl.gz"
- config_name: 2011-01
data_files: "files/2011-01.jsonl.gz"
- config_name: 2011-02
data_files: "files/2011-02.jsonl.gz"
- config_name: 2011-03
data_files: "files/2011-03.jsonl.gz"
- config_name: 2011-04
data_files: "files/2011-04.jsonl.gz"
- config_name: 2011-05
data_files: "files/2011-05.jsonl.gz"
- config_name: 2011-06
data_files: "files/2011-06.jsonl.gz"
- config_name: 2011-07
data_files: "files/2011-07.jsonl.gz"
- config_name: 2011-08
data_files: "files/2011-08.jsonl.gz"
- config_name: 2011-09
data_files: "files/2011-09.jsonl.gz"
- config_name: 2011-10
data_files: "files/2011-10.jsonl.gz"
- config_name: 2011-11
data_files: "files/2011-11.jsonl.gz"
- config_name: 2011-12
data_files: "files/2011-12.jsonl.gz"
- config_name: 2012-01
data_files: "files/2012-01.jsonl.gz"
- config_name: 2012-02
data_files: "files/2012-02.jsonl.gz"
- config_name: 2012-03
data_files: "files/2012-03.jsonl.gz"
- config_name: 2012-04
data_files: "files/2012-04.jsonl.gz"
- config_name: 2012-05
data_files: "files/2012-05.jsonl.gz"
- config_name: 2012-06
data_files: "files/2012-06.jsonl.gz"
- config_name: 2012-07
data_files: "files/2012-07.jsonl.gz"
- config_name: 2012-08
data_files: "files/2012-08.jsonl.gz"
- config_name: 2012-09
data_files: "files/2012-09.jsonl.gz"
- config_name: 2012-10
data_files: "files/2012-10.jsonl.gz"
- config_name: 2012-11
data_files: "files/2012-11.jsonl.gz"
- config_name: 2012-12
data_files: "files/2012-12.jsonl.gz"
- config_name: 2013-01
data_files: "files/2013-01.jsonl.gz"
- config_name: 2013-02
data_files: "files/2013-02.jsonl.gz"
- config_name: 2013-03
data_files: "files/2013-03.jsonl.gz"
- config_name: 2013-04
data_files: "files/2013-04.jsonl.gz"
- config_name: 2013-05
data_files: "files/2013-05.jsonl.gz"
- config_name: 2013-06
data_files: "files/2013-06.jsonl.gz"
- config_name: 2013-07
data_files: "files/2013-07.jsonl.gz"
- config_name: 2013-08
data_files: "files/2013-08.jsonl.gz"
- config_name: 2013-09
data_files: "files/2013-09.jsonl.gz"
- config_name: 2013-10
data_files: "files/2013-10.jsonl.gz"
- config_name: 2013-11
data_files: "files/2013-11.jsonl.gz"
- config_name: 2013-12
data_files: "files/2013-12.jsonl.gz"
- config_name: 2014-01
data_files: "files/2014-01.jsonl.gz"
- config_name: 2014-02
data_files: "files/2014-02.jsonl.gz"
- config_name: 2014-03
data_files: "files/2014-03.jsonl.gz"
- config_name: 2014-04
data_files: "files/2014-04.jsonl.gz"
- config_name: 2014-05
data_files: "files/2014-05.jsonl.gz"
- config_name: 2014-06
data_files: "files/2014-06.jsonl.gz"
- config_name: 2014-07
data_files: "files/2014-07.jsonl.gz"
- config_name: 2014-08
data_files: "files/2014-08.jsonl.gz"
- config_name: 2014-09
data_files: "files/2014-09.jsonl.gz"
- config_name: 2014-10
data_files: "files/2014-10.jsonl.gz"
- config_name: 2014-11
data_files: "files/2014-11.jsonl.gz"
- config_name: 2014-12
data_files: "files/2014-12.jsonl.gz"
- config_name: 2015-01
data_files: "files/2015-01.jsonl.gz"
- config_name: 2015-02
data_files: "files/2015-02.jsonl.gz"
- config_name: 2015-03
data_files: "files/2015-03.jsonl.gz"
- config_name: 2015-04
data_files: "files/2015-04.jsonl.gz"
- config_name: 2015-05
data_files: "files/2015-05.jsonl.gz"
- config_name: 2015-06
data_files: "files/2015-06.jsonl.gz"
- config_name: 2015-07
data_files: "files/2015-07.jsonl.gz"
- config_name: 2015-08
data_files: "files/2015-08.jsonl.gz"
- config_name: 2015-09
data_files: "files/2015-09.jsonl.gz"
- config_name: 2015-10
data_files: "files/2015-10.jsonl.gz"
- config_name: 2015-11
data_files: "files/2015-11.jsonl.gz"
- config_name: 2015-12
data_files: "files/2015-12.jsonl.gz"
- config_name: 2016-01
data_files: "files/2016-01.jsonl.gz"
- config_name: 2016-02
data_files: "files/2016-02.jsonl.gz"
- config_name: 2016-03
data_files: "files/2016-03.jsonl.gz"
- config_name: 2016-04
data_files: "files/2016-04.jsonl.gz"
- config_name: 2016-05
data_files: "files/2016-05.jsonl.gz"
- config_name: 2016-06
data_files: "files/2016-06.jsonl.gz"
- config_name: 2016-07
data_files: "files/2016-07.jsonl.gz"
- config_name: 2016-08
data_files: "files/2016-08.jsonl.gz"
- config_name: 2016-09
data_files: "files/2016-09.jsonl.gz"
- config_name: 2016-10
data_files: "files/2016-10.jsonl.gz"
- config_name: 2016-11
data_files: "files/2016-11.jsonl.gz"
- config_name: 2016-12
data_files: "files/2016-12.jsonl.gz"
- config_name: 2017-01
data_files: "files/2017-01.jsonl.gz"
- config_name: 2017-02
data_files: "files/2017-02.jsonl.gz"
- config_name: 2017-03
data_files: "files/2017-03.jsonl.gz"
- config_name: 2017-04
data_files: "files/2017-04.jsonl.gz"
- config_name: 2017-05
data_files: "files/2017-05.jsonl.gz"
- config_name: 2017-06
data_files: "files/2017-06.jsonl.gz"
- config_name: 2017-07
data_files: "files/2017-07.jsonl.gz"
- config_name: 2017-08
data_files: "files/2017-08.jsonl.gz"
- config_name: 2017-09
data_files: "files/2017-09.jsonl.gz"
- config_name: 2017-10
data_files: "files/2017-10.jsonl.gz"
- config_name: 2017-11
data_files: "files/2017-11.jsonl.gz"
- config_name: 2017-12
data_files: "files/2017-12.jsonl.gz"
- config_name: 2018-01
data_files: "files/2018-01.jsonl.gz"
- config_name: 2018-02
data_files: "files/2018-02.jsonl.gz"
- config_name: 2018-03
data_files: "files/2018-03.jsonl.gz"
- config_name: 2018-04
data_files: "files/2018-04.jsonl.gz"
- config_name: 2018-05
data_files: "files/2018-05.jsonl.gz"
- config_name: 2018-06
data_files: "files/2018-06.jsonl.gz"
- config_name: 2018-07
data_files: "files/2018-07.jsonl.gz"
- config_name: 2018-08
data_files: "files/2018-08.jsonl.gz"
- config_name: 2018-09
data_files: "files/2018-09.jsonl.gz"
- config_name: 2018-10
data_files: "files/2018-10.jsonl.gz"
- config_name: 2018-11
data_files: "files/2018-11.jsonl.gz"
- config_name: 2018-12
data_files: "files/2018-12.jsonl.gz"
- config_name: 2019-01
data_files: "files/2019-01.jsonl.gz"
- config_name: 2019-02
data_files: "files/2019-02.jsonl.gz"
- config_name: 2019-03
data_files: "files/2019-03.jsonl.gz"
- config_name: 2019-04
data_files: "files/2019-04.jsonl.gz"
- config_name: 2019-05
data_files: "files/2019-05.jsonl.gz"
- config_name: 2019-06
data_files: "files/2019-06.jsonl.gz"
- config_name: 2019-07
data_files: "files/2019-07.jsonl.gz"
- config_name: 2019-08
data_files: "files/2019-08.jsonl.gz"
- config_name: 2019-09
data_files: "files/2019-09.jsonl.gz"
- config_name: 2019-10
data_files: "files/2019-10.jsonl.gz"
- config_name: 2019-11
data_files: "files/2019-11.jsonl.gz"
- config_name: 2019-12
data_files: "files/2019-12.jsonl.gz"
- config_name: 2020-01
data_files: "files/2020-01.jsonl.gz"
- config_name: 2020-02
data_files: "files/2020-02.jsonl.gz"
- config_name: 2020-03
data_files: "files/2020-03.jsonl.gz"
- config_name: 2020-04
data_files: "files/2020-04.jsonl.gz"
- config_name: 2020-05
data_files: "files/2020-05.jsonl.gz"
- config_name: 2020-06
data_files: "files/2020-06.jsonl.gz"
- config_name: 2020-07
data_files: "files/2020-07.jsonl.gz"
- config_name: 2020-08
data_files: "files/2020-08.jsonl.gz"
- config_name: 2020-09
data_files: "files/2020-09.jsonl.gz"
- config_name: 2020-10
data_files: "files/2020-10.jsonl.gz"
- config_name: 2020-11
data_files: "files/2020-11.jsonl.gz"
- config_name: 2020-12
data_files: "files/2020-12.jsonl.gz"
- config_name: 2021-01
data_files: "files/2021-01.jsonl.gz"
- config_name: 2021-02
data_files: "files/2021-02.jsonl.gz"
- config_name: 2021-03
data_files: "files/2021-03.jsonl.gz"
- config_name: 2021-04
data_files: "files/2021-04.jsonl.gz"
- config_name: 2021-05
data_files: "files/2021-05.jsonl.gz"
- config_name: 2021-06
data_files: "files/2021-06.jsonl.gz"
- config_name: 2021-07
data_files: "files/2021-07.jsonl.gz"
- config_name: 2021-08
data_files: "files/2021-08.jsonl.gz"
- config_name: 2021-09
data_files: "files/2021-09.jsonl.gz"
- config_name: 2021-10
data_files: "files/2021-10.jsonl.gz"
- config_name: 2021-11
data_files: "files/2021-11.jsonl.gz"
- config_name: 2021-12
data_files: "files/2021-12.jsonl.gz"
- config_name: 2022-01
data_files: "files/2022-01.jsonl.gz"
- config_name: 2022-02
data_files: "files/2022-02.jsonl.gz"
- config_name: 2022-03
data_files: "files/2022-03.jsonl.gz"
- config_name: 2022-04
data_files: "files/2022-04.jsonl.gz"
- config_name: 2022-05
data_files: "files/2022-05.jsonl.gz"
- config_name: 2022-06
data_files: "files/2022-06.jsonl.gz"
- config_name: 2022-07
data_files: "files/2022-07.jsonl.gz"
- config_name: 2022-08
data_files: "files/2022-08.jsonl.gz"
- config_name: 2022-09
data_files: "files/2022-09.jsonl.gz"
- config_name: 2022-10
data_files: "files/2022-10.jsonl.gz"
- config_name: 2022-11
data_files: "files/2022-11.jsonl.gz"
- config_name: 2022-12
data_files: "files/2022-12.jsonl.gz"
- config_name: 2023-01
data_files: "files/2023-01.jsonl.gz"
- config_name: 2023-02
data_files: "files/2023-02.jsonl.gz"
- config_name: 2023-03
data_files: "files/2023-03.jsonl.gz"
- config_name: 2023-04
data_files: "files/2023-04.jsonl.gz"
- config_name: 2023-05
data_files: "files/2023-05.jsonl.gz"
- config_name: 2023-06
data_files: "files/2023-06.jsonl.gz"
- config_name: 2023-07
data_files: "files/2023-07.jsonl.gz"
- config_name: 2023-08
data_files: "files/2023-08.jsonl.gz"
- config_name: 2023-09
data_files: "files/2023-09.jsonl.gz"
- config_name: 2023-10
data_files: "files/2023-10.jsonl.gz"
- config_name: 2023-11
data_files: "files/2023-11.jsonl.gz"
- config_name: 2023-12
data_files: "files/2023-12.jsonl.gz"
---
# 🇪🇺 🏷️ EuroVoc dataset
This dataset contains more than 3,700,000 documents in 39 languages with their associated EuroVoc labels.
## What is Cellar?
Cellar is the common data repository of the Publications Office of the European Union. Digital publications and metadata are stored in and disseminated via Cellar, to be used by both humans and machines. Aiming to serve users transparently, Cellar stores multilingual publications and metadata; it is open to all EU citizens and provides machine-readable data.
https://op.europa.eu/fr/web/cellar
## Why was this dataset created ?
"Extreme classification come with challenges of scalability due to large label spaces, data sparsity issues due to insufficient training samples."
https://medium.com/datapy-ai/extreme-multi-label-classification-for-eurovoc-b51d74623820
## How was this dataset created?
The source code is available; see `cellar.py`.
## When was this dataset created?
14 July 2023
## What are the main characteristics of this dataset?
There are a total of 39 different languages present in this dataset, some of which are EU languages and some of which are not. As the following graph illustrates, most of the documents are written in EU languages (English being the most represented), while non-EU languages (for example Arabic and Japanese) appear only marginally. Note that since Irish (`gle`) was only granted full official and working status in the EU in 2022, there are very few documents in that language. Croatian (`hrv`) is also less represented, as Croatia is the most recent country to have joined the EU, in 2013.

The length of the documents also varies depending on the language they are written in. Document lengths are quite variable, especially in English, so there is a large disparity in lengths across this dataset. Note that this boxplot does not show outliers, since certain documents contain up to 86 million characters. The red lines in the boxplot indicate the median document length for each language.

We notice that the documents in Irish show a very wide variability in length, because there are so few of them. We therefore present the same boxplot without Irish, in order to visualize the document length distribution in the other languages in more detail.

## How is the data structured?
A sample of this dataset looks like the following:
```json
{
"title": "Commission information notice...",
"date": "2023-09-29",
"eurovoc_concepts": ["air transport", "intra-EU transport"],
"url": "http://publications.europa.eu/resource/cellar/ec99987f-5e69-11ee-9220-01aa75ed71a1",
"lang": "eng",
"formats": ["fmx4", "pdfa2a", "xhtml"],
"text": "To ensure ownership by the relevant actors,..."
}
```
- `title` : title of the document
- `date` : publication date of the document
- `eurovoc_concepts` : list of the EuroVoc concepts related to this document
- `url` : URL to access the document
- `lang` : language of the document
- `formats` : list of formats in which the original document is available
- `text` : text content of the document
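Since each sample is a flat JSON object, records can be consumed directly for extreme multi-label classification. A minimal sketch of multi-hot encoding the `eurovoc_concepts` field (the record mirrors the sample above; the label vocabulary is illustrative, not the real EuroVoc thesaurus):

```python
import json

# Record following the schema described above (values from the sample).
record = json.loads("""
{
  "title": "Commission information notice...",
  "date": "2023-09-29",
  "eurovoc_concepts": ["air transport", "intra-EU transport"],
  "url": "http://publications.europa.eu/resource/cellar/ec99987f-5e69-11ee-9220-01aa75ed71a1",
  "lang": "eng",
  "formats": ["fmx4", "pdfa2a", "xhtml"],
  "text": "To ensure ownership by the relevant actors,..."
}
""")

# Multi-hot encode the EuroVoc labels against a (hypothetical, tiny) label
# vocabulary, as a multi-label classifier would expect as its target vector.
label_vocab = ["air transport", "intra-EU transport", "rail transport"]
multi_hot = [1 if label in record["eurovoc_concepts"] else 0 for label in label_vocab]
print(multi_hot)  # -> [1, 1, 0]
```

In practice the label vocabulary would be the full set of EuroVoc descriptors collected over the training split, which is what makes the label space "extreme".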
## Bibliography
- Ilias Chalkidis, Emmanouil Fergadiotis, Prodromos Malakasiotis, Nikolaos Aletras, and Ion Androutsopoulos. 2019. Extreme Multi-Label Legal Text Classification: A Case Study in EU Legislation. In Proceedings of the Natural Legal Language Processing Workshop 2019, pages 78–87, Minneapolis, Minnesota. Association for Computational Linguistics.
- I. Chalkidis, M. Fergadiotis, P. Malakasiotis and I. Androutsopoulos, Large-Scale Multi-Label Text Classification on EU Legislation. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), Florence, Italy, (short papers), 2019.
- Andrei-Marius Avram, Vasile Pais, and Dan Ioan Tufis. 2021. PyEuroVoc: A Tool for Multilingual Legal Document Classification with EuroVoc Descriptors. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021), pages 92–101, Held Online. INCOMA Ltd..
- SHAHEEN, Zein, WOHLGENANNT, Gerhard, et FILTZ, Erwin. Large scale legal text classification using transformer models. arXiv preprint arXiv:2010.12871, 2020.
## Author(s)
Sébastien Campion <sebastien.campion@europarl.europa.eu>
|
El-chapoo/Complex_data-v1.1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1272300153
num_examples: 794234
download_size: 592509110
dataset_size: 1272300153
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tellarin-ai/ntx_llm_inst_turkish | ---
license: cc-by-sa-4.0
language:
- tr
task_categories:
- token-classification
---
# Dataset Card for NTX v1 in the Aya format - Turkish subset
This dataset is a conversion of the Turkish data from the original NTX into the Aya instruction format, and it is released here under the CC-BY-SA 4.0 license.
## Dataset Details
For the original NTX dataset, the conversion to the Aya instruction format, or more details, please refer to the full dataset in instruction form (https://huggingface.co/datasets/tellarin-ai/ntx_llm_instructions) or to the paper below.
**NOTE:** Unfortunately, due to a conversion issue with numerical expressions, this version only includes the temporal-expressions part of NTX.
## Citation
If you utilize this dataset version, feel free to cite/footnote the complete version at https://huggingface.co/datasets/tellarin-ai/ntx_llm_instructions, but please also cite the *original dataset publication*.
**BibTeX:**
```
@misc{chen2023dataset,
title={Dataset and Baseline System for Multi-lingual Extraction and Normalization of Temporal and Numerical Expressions},
author={Sanxing Chen and Yongqiang Chen and Börje F. Karlsson},
year={2023},
eprint={2303.18103},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
autoevaluate/autoeval-staging-eval-project-72b4615f-7404801 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- adversarial_qa
eval_info:
task: extractive_question_answering
model: osanseviero/distilbert-base-uncased-finetuned-squad-d5716d28
dataset_name: adversarial_qa
dataset_config: adversarialQA
dataset_split: train
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: osanseviero/distilbert-base-uncased-finetuned-squad-d5716d28
* Dataset: adversarial_qa
To run new evaluation jobs, visit Hugging Face's [automatic evaluation service](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@osanseviero](https://huggingface.co/osanseviero) for evaluating this model. |
open-llm-leaderboard/details_Undi95__CreativityEngine | ---
pretty_name: Evaluation run of Undi95/CreativityEngine
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/CreativityEngine](https://huggingface.co/Undi95/CreativityEngine) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__CreativityEngine\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T03:34:54.369545](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__CreativityEngine/blob/main/results_2023-10-28T03-34-54.369545.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.24706375838926176,\n\
\ \"em_stderr\": 0.00441695804511364,\n \"f1\": 0.32981753355704885,\n\
\ \"f1_stderr\": 0.004357223834591547,\n \"acc\": 0.4187184690035083,\n\
\ \"acc_stderr\": 0.010197442302564062\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.24706375838926176,\n \"em_stderr\": 0.00441695804511364,\n\
\ \"f1\": 0.32981753355704885,\n \"f1_stderr\": 0.004357223834591547\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09552691432903715,\n \
\ \"acc_stderr\": 0.008096605771155733\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.01229827883397239\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Undi95/CreativityEngine
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T03_34_54.369545
path:
- '**/details_harness|drop|3_2023-10-28T03-34-54.369545.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T03-34-54.369545.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T03_34_54.369545
path:
- '**/details_harness|gsm8k|5_2023-10-28T03-34-54.369545.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T03-34-54.369545.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-22-32.752077.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-22-32.752077.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T17-22-32.752077.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T03_34_54.369545
path:
- '**/details_harness|winogrande|5_2023-10-28T03-34-54.369545.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T03-34-54.369545.parquet'
- config_name: results
data_files:
- split: 2023_09_11T17_22_32.752077
path:
- results_2023-09-11T17-22-32.752077.parquet
- split: 2023_10_28T03_34_54.369545
path:
- results_2023-10-28T03-34-54.369545.parquet
- split: latest
path:
- results_2023-10-28T03-34-54.369545.parquet
---
# Dataset Card for Evaluation run of Undi95/CreativityEngine
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/CreativityEngine
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/CreativityEngine](https://huggingface.co/Undi95/CreativityEngine) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__CreativityEngine",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T03:34:54.369545](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__CreativityEngine/blob/main/results_2023-10-28T03-34-54.369545.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.24706375838926176,
"em_stderr": 0.00441695804511364,
"f1": 0.32981753355704885,
"f1_stderr": 0.004357223834591547,
"acc": 0.4187184690035083,
"acc_stderr": 0.010197442302564062
},
"harness|drop|3": {
"em": 0.24706375838926176,
"em_stderr": 0.00441695804511364,
"f1": 0.32981753355704885,
"f1_stderr": 0.004357223834591547
},
"harness|gsm8k|5": {
"acc": 0.09552691432903715,
"acc_stderr": 0.008096605771155733
},
"harness|winogrande|5": {
"acc": 0.7419100236779794,
"acc_stderr": 0.01229827883397239
}
}
```
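As a quick sanity check on the aggregated block, the top-level `"acc"` can be recomputed from the per-task entries: it is the unweighted mean of the gsm8k and winogrande accuracies (a minimal sketch; the dict below just copies the numbers from the JSON above, and the unweighted-averaging rule is an assumption about how the harness aggregates):

```python
# Per-task accuracies copied from the latest-results JSON above.
task_results = {
    "harness|gsm8k|5": {"acc": 0.09552691432903715},
    "harness|winogrande|5": {"acc": 0.7419100236779794},
}

# The "all" acc appears to be the unweighted mean of the task-level accs;
# "em" and "f1" are carried over directly from the drop task.
accs = [metrics["acc"] for metrics in task_results.values()]
mean_acc = sum(accs) / len(accs)
print(mean_acc)  # matches the reported "all" acc of 0.4187184690035083 (up to float rounding)
```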
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_qqp_clause_final_really_but | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 5173
num_examples: 33
- name: test
num_bytes: 41200
num_examples: 278
- name: train
num_bytes: 46754
num_examples: 303
download_size: 61836
dataset_size: 93127
---
# Dataset Card for "MULTI_VALUE_qqp_clause_final_really_but"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vanzill/_2_xxoamix_2_ | ---
license: cc
---
|
sorbhet/traindacted | ---
license: apache-2.0
---
|
polinaeterna/cail2018 | ---
dataset_info:
features:
- name: fact
dtype: string
- name: relevant_articles
sequence: int32
- name: accusation
sequence: string
- name: punish_of_money
dtype: float32
- name: criminals
sequence: string
- name: death_penalty
dtype: bool
- name: imprisonment
dtype: float32
- name: life_imprisonment
dtype: bool
splits:
- name: exercise_contest_train
num_bytes: 220112348
num_examples: 154592
- name: exercise_contest_valid
num_bytes: 21702109
num_examples: 17131
- name: exercise_contest_test
num_bytes: 41057538
num_examples: 32508
- name: first_stage_train
num_bytes: 1779653382
num_examples: 1710856
- name: first_stage_test
num_bytes: 244334666
num_examples: 217016
- name: final_test
num_bytes: 44194611
num_examples: 35922
download_size: 1167865756
dataset_size: 2351054654
configs:
- config_name: default
data_files:
- split: exercise_contest_train
path: data/exercise_contest_train-*
- split: exercise_contest_valid
path: data/exercise_contest_valid-*
- split: exercise_contest_test
path: data/exercise_contest_test-*
- split: first_stage_train
path: data/first_stage_train-*
- split: first_stage_test
path: data/first_stage_test-*
- split: final_test
path: data/final_test-*
---
|
distilled-from-one-sec-cv12/chunk_163 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1137477008
num_examples: 221644
download_size: 1160129594
dataset_size: 1137477008
---
# Dataset Card for "chunk_163"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
li-ping/all_pdf_dataset_1029_416data | ---
dataset_info:
features:
- name: set
struct:
- name: neg
sequence: string
- name: pos
sequence: string
- name: query
dtype: string
splits:
- name: train
num_bytes: 20453375
num_examples: 8072
download_size: 698908
dataset_size: 20453375
---
# Dataset Card for "all_pdf_dataset_1029_416data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_splm__zephyr-7b-sft-full-spin-peft-iter2 | ---
pretty_name: Evaluation run of splm/zephyr-7b-sft-full-spin-peft-iter2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [splm/zephyr-7b-sft-full-spin-peft-iter2](https://huggingface.co/splm/zephyr-7b-sft-full-spin-peft-iter2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_splm__zephyr-7b-sft-full-spin-peft-iter2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T13:50:16.813938](https://huggingface.co/datasets/open-llm-leaderboard/details_splm__zephyr-7b-sft-full-spin-peft-iter2/blob/main/results_2024-02-09T13-50-16.813938.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5987875444384797,\n\
\ \"acc_stderr\": 0.03303710140961152,\n \"acc_norm\": 0.6052818921495978,\n\
\ \"acc_norm_stderr\": 0.03372605811236996,\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.4178934110711531,\n\
\ \"mc2_stderr\": 0.014676975153876327\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5435153583617748,\n \"acc_stderr\": 0.014555949760496442,\n\
\ \"acc_norm\": 0.5802047781569966,\n \"acc_norm_stderr\": 0.014422181226303028\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6089424417446724,\n\
\ \"acc_stderr\": 0.004869899297734548,\n \"acc_norm\": 0.8077076279625572,\n\
\ \"acc_norm_stderr\": 0.0039329609740080766\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.03878139888797611,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.03878139888797611\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.03899073687357335,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.03899073687357335\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n\
\ \"acc_stderr\": 0.025189006660212385,\n \"acc_norm\": 0.7322580645161291,\n\
\ \"acc_norm_stderr\": 0.025189006660212385\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885416,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885416\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.02717121368316453,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.02717121368316453\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.0249393139069408,\n \
\ \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.0249393139069408\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \
\ \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7926605504587156,\n \"acc_stderr\": 0.017381415563608674,\n \"\
acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.017381415563608674\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643524,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643524\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293433,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293433\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847834,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847834\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7816091954022989,\n\
\ \"acc_stderr\": 0.014774358319934493,\n \"acc_norm\": 0.7816091954022989,\n\
\ \"acc_norm_stderr\": 0.014774358319934493\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.02500931379006971,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.02500931379006971\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36089385474860336,\n\
\ \"acc_stderr\": 0.016062290671110462,\n \"acc_norm\": 0.36089385474860336,\n\
\ \"acc_norm_stderr\": 0.016062290671110462\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.026716118380156847,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.026716118380156847\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\
\ \"acc_stderr\": 0.026795422327893934,\n \"acc_norm\": 0.6655948553054662,\n\
\ \"acc_norm_stderr\": 0.026795422327893934\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.02651759772446501,\n\
\ \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.02651759772446501\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41851368970013036,\n\
\ \"acc_stderr\": 0.012599505608336461,\n \"acc_norm\": 0.41851368970013036,\n\
\ \"acc_norm_stderr\": 0.012599505608336461\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02989616303312547,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02989616303312547\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6143790849673203,\n \"acc_stderr\": 0.019691459052354032,\n \
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.019691459052354032\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.046737523336702384,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.046737523336702384\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982062,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982062\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.02768691358801301,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.02768691358801301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036847,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036847\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"\
acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\"\
: 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\":\
\ {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.4178934110711531,\n\
\ \"mc2_stderr\": 0.014676975153876327\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650882\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2850644427596664,\n \
\ \"acc_stderr\": 0.01243504233490401\n }\n}\n```"
repo_url: https://huggingface.co/splm/zephyr-7b-sft-full-spin-peft-iter2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|arc:challenge|25_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|gsm8k|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hellaswag|10_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T13-50-16.813938.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T13-50-16.813938.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- '**/details_harness|winogrande|5_2024-02-09T13-50-16.813938.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T13-50-16.813938.parquet'
- config_name: results
data_files:
- split: 2024_02_09T13_50_16.813938
path:
- results_2024-02-09T13-50-16.813938.parquet
- split: latest
path:
- results_2024-02-09T13-50-16.813938.parquet
---
# Dataset Card for Evaluation run of splm/zephyr-7b-sft-full-spin-peft-iter2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [splm/zephyr-7b-sft-full-spin-peft-iter2](https://huggingface.co/splm/zephyr-7b-sft-full-spin-peft-iter2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_splm__zephyr-7b-sft-full-spin-peft-iter2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-09T13:50:16.813938](https://huggingface.co/datasets/open-llm-leaderboard/details_splm__zephyr-7b-sft-full-spin-peft-iter2/blob/main/results_2024-02-09T13-50-16.813938.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5987875444384797,
"acc_stderr": 0.03303710140961152,
"acc_norm": 0.6052818921495978,
"acc_norm_stderr": 0.03372605811236996,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.4178934110711531,
"mc2_stderr": 0.014676975153876327
},
"harness|arc:challenge|25": {
"acc": 0.5435153583617748,
"acc_stderr": 0.014555949760496442,
"acc_norm": 0.5802047781569966,
"acc_norm_stderr": 0.014422181226303028
},
"harness|hellaswag|10": {
"acc": 0.6089424417446724,
"acc_stderr": 0.004869899297734548,
"acc_norm": 0.8077076279625572,
"acc_norm_stderr": 0.0039329609740080766
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.03878139888797611,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.03878139888797611
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.03899073687357335,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.03899073687357335
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.025167982333894143,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.025167982333894143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.025189006660212385,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.025189006660212385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885416,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885416
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.02717121368316453,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.02717121368316453
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.0249393139069408,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.0249393139069408
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206865,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206865
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.017381415563608674,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.017381415563608674
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643524,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643524
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293433,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293433
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847834,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847834
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7816091954022989,
"acc_stderr": 0.014774358319934493,
"acc_norm": 0.7816091954022989,
"acc_norm_stderr": 0.014774358319934493
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.02500931379006971,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.02500931379006971
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36089385474860336,
"acc_stderr": 0.016062290671110462,
"acc_norm": 0.36089385474860336,
"acc_norm_stderr": 0.016062290671110462
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.026716118380156847,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.026716118380156847
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.026795422327893934,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.026795422327893934
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41851368970013036,
"acc_stderr": 0.012599505608336461,
"acc_norm": 0.41851368970013036,
"acc_norm_stderr": 0.012599505608336461
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.02989616303312547,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.02989616303312547
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.019691459052354032,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.019691459052354032
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.046737523336702384,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.046737523336702384
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982062,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982062
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801301,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036847,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036847
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.4178934110711531,
"mc2_stderr": 0.014676975153876327
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650882
},
"harness|gsm8k|5": {
"acc": 0.2850644427596664,
"acc_stderr": 0.01243504233490401
}
}
```
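As a sketch (using a hand-copied subset of the JSON above rather than loading the full results file), the per-task entries can be aggregated into macro-averages like those in the `"all"` block:

```python
# Sketch: macro-averaging per-task accuracies from a results dict shaped like
# the JSON above. `results` is a small hand-copied subset for illustration.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5, "acc_norm": 0.5},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8187134502923976,
                                                "acc_norm": 0.8187134502923976},
    "harness|winogrande|5": {"acc": 0.7647987371744278},
}

# Average `acc` over the MMLU (hendrycksTest) tasks only.
mmlu = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu) / len(mmlu)
print(round(mmlu_avg, 4))
```

The real `"all"` block is computed over every evaluated task, not just the two shown here.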
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Squish42/bluemoon-fandom-1-1-rp-cleaned | ---
language:
- en
pretty_name: "Bluemoon - Fandom 1x1 Roleplay"
tags:
- not-for-all-audiences
- roleplay
- creative
license: wtfpl
task_categories:
- conversational
- text-generation
size_categories:
- 100K<n<1M
---
290,544 posts of roleplay forum data scraped by a third party. The source data is not available here.
It should be effective when used to fine-tune models for one-on-one roleplay and creative writing.
Additionally, it may help generate various fanfiction-style writing and scenarios.
The `dataset.yaml` file contains the SHA512 hash of the source data and accurately describes each step resulting in this
dataset.
This dataset has been cleaned and formatted for use with fastchat.
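As a rough sketch of that formatting, a 1x1 roleplay thread maps onto the conversation layout commonly used for FastChat fine-tuning (the placeholder posts and the exact field names here are assumptions; check the dataset's actual columns):

```python
# Sketch: shaping a two-author roleplay thread into a FastChat-style
# conversation record. Post contents are placeholders, not real data.
posts = [
    ("alice", "The tavern door creaks open."),
    ("bob", "A hooded figure steps inside."),
    ("alice", "The barkeep looks up warily."),
]

def to_fastchat(thread_id, posts):
    # The first poster becomes "human"; the other participant becomes "gpt".
    roles = {posts[0][0]: "human"}
    record = {"id": thread_id, "conversations": []}
    for author, text in posts:
        role = roles.setdefault(author, "gpt")
        record["conversations"].append({"from": role, "value": text})
    return record

record = to_fastchat("thread-0001", posts)
print(record["conversations"][1]["from"])  # the second poster gets the "gpt" role
```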

 |
Galahad3x/QADatasetForPatho | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 35253904
num_examples: 1567
- name: test
num_bytes: 9123066
num_examples: 339
download_size: 655278
dataset_size: 44376970
---
# Dataset Card for "QADatasetForPatho"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_129 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1129960628
num_examples: 221909
download_size: 1154208586
dataset_size: 1129960628
---
# Dataset Card for "chunk_129"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LuisaRomana/adv-ele | ---
dataset_info:
features:
- name: ADV
dtype: string
- name: ELE
dtype: string
splits:
- name: train
num_bytes: 430918.56140350876
num_examples: 1732
- name: test
num_bytes: 107978.43859649122
num_examples: 434
download_size: 294610
dataset_size: 538897.0
---
# Dataset Card for "adv-ele"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
IvanD2002/Task_Dataset | ---
license: openrail
---
|
Circularmachines/batch_indexing_machine_230529_011 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 156625309.0
num_examples: 720
download_size: 156636586
dataset_size: 156625309.0
---
# Dataset Card for "batch_indexing_machine_230529_011"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-xsum-8015d52c-11325509 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: tuner007/pegasus_summarizer
metrics: ['accuracy']
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: tuner007/pegasus_summarizer
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Neez](https://huggingface.co/Neez) for evaluating this model. |
bhadresh-savani/translate_code_geeksforgeeks_for_t5 | ---
license: mit
language:
- en
tags:
- code
- c++
- python
- java
pretty_name: CodeT5-Translate
size_categories:
- 1K<n<10K
--- |
kanishka/counterfactual_babylm_pipps_10k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 586264365
num_examples: 11642617
- name: validation
num_bytes: 56120230
num_examples: 1026747
download_size: 424672619
dataset_size: 642384595
---
# Dataset Card for "counterfactual_babylm_pipps_10k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_82 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1355276536
num_examples: 266158
download_size: 1381999770
dataset_size: 1355276536
---
# Dataset Card for "chunk_82"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
augtoma/usmle_step_2 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: options
struct:
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: E
dtype: string
- name: F
dtype: string
- name: G
dtype: string
- name: answer
dtype: string
- name: answer_idx
dtype: string
splits:
- name: test
num_bytes: 133267
num_examples: 109
download_size: 80679
dataset_size: 133267
---
# Dataset Card for "usmle_self_eval_step2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_wnli_proximal_distal_demonstratives | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1034
num_examples: 5
- name: test
num_bytes: 3445
num_examples: 13
- name: train
num_bytes: 10422
num_examples: 44
download_size: 15062
dataset_size: 14901
---
# Dataset Card for "MULTI_VALUE_wnli_proximal_distal_demonstratives"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BigRedAI/Ithaca-Restaurants | ---
license: mit
---
|
jtatman/civil_comments_hatebert | ---
dataset_info:
features:
- name: text
dtype: string
- name: text_masked
dtype: string
- name: text_replaced
list:
- name: score
dtype: float64
- name: sequence
dtype: string
- name: token
dtype: int64
- name: token_str
dtype: string
splits:
- name: train
num_bytes: 872262083
num_examples: 451219
download_size: 333147199
dataset_size: 872262083
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- text-classification
- text2text-generation
- fill-mask
language:
- en
tags:
- masked
- mask-scored
- comment scoring
- masked-model
pretty_name: civil comments w/hatebert scoring
size_categories:
- 100K<n<1M
---
# Dataset Card for "civil_comments_hatebert"
This is an experiment to see how offensive speech in "civil_comments" can be altered by models without much manipulation in certain cases.
This data is a reformat of the civil_comments dataset: all abusive-speech scoring attributes are discarded, random tokens are masked, and the masked text is processed with HateBERT to fill the masked tokens with possibly abusive language.
This sets up useful data for three things: fill-mask activities, text training, and scored responses based on how random tokens can be manipulated by this model.
To show each stage of the transformation, three columns hold the original extracted text, the randomly masked text, and the filled text with the HateBERT output scores in a list.
So far in practice, the HateBERT model mostly fills with innocuous placeholders, based on *very* limited testing.
HateBERT is, as it sounds, a BERT-based model trained on fill-mask activities.
[civil_comments dataset](https://huggingface.co/datasets/civil_comments)
[hatebert model](https://huggingface.co/GroNLP/hateBERT)
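A minimal sketch of the random-masking step described above (the masking rate, fixed seed, and `[MASK]` token are assumptions for illustration; the card does not state the exact procedure used):

```python
import random

# Sketch: mask random whitespace-separated tokens, as described above.
# The 30% rate, seed, and "[MASK]" token are assumptions, not from the card.
def mask_random_tokens(text, rate=0.3, seed=0):
    rng = random.Random(seed)
    return " ".join(
        "[MASK]" if rng.random() < rate else tok for tok in text.split()
    )

masked = mask_random_tokens("this comment is perfectly civil and polite")
print(masked)  # some tokens replaced by [MASK], ready for a fill-mask model
```

The masked string can then be passed to a fill-mask model such as HateBERT to produce the third column.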
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rkdeva/QA_Dataset-2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 249203
num_examples: 103
download_size: 112062
dataset_size: 249203
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "QA_Dataset-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TrainThenObtain-ai/Utra-mini-GPT-4 | ---
license: cc
---
|
CyberHarem/asuna_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of asuna/一之瀬アスナ/明日奈 (Blue Archive)
This is the dataset of asuna/一之瀬アスナ/明日奈 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `long_hair, breasts, large_breasts, blue_eyes, light_brown_hair, hair_over_one_eye, halo, ribbon, very_long_hair, bow, hair_ribbon, blue_bow, mole, mole_on_breast, asymmetrical_bangs, blue_ribbon, animal_ears, fake_animal_ears, rabbit_ears`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.16 GiB | [Download](https://huggingface.co/datasets/CyberHarem/asuna_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 950.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuna_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1387 | 1.90 GiB | [Download](https://huggingface.co/datasets/CyberHarem/asuna_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/asuna_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, bare_shoulders, blue_bowtie, blue_leotard, brown_pantyhose, cleavage, detached_collar, elbow_gloves, highleg_leotard, looking_at_viewer, official_alternate_costume, playboy_bunny, solo, strapless_leotard, thighband_pantyhose, traditional_bowtie, white_gloves, blush, grin, black_pantyhose, sitting, blonde_hair |
| 1 | 12 |  |  |  |  |  | 1girl, bare_shoulders, blue_bowtie, blue_leotard, blush, cleavage, detached_collar, looking_at_viewer, official_alternate_costume, playboy_bunny, simple_background, solo, strapless_leotard, white_background, white_gloves, black_pantyhose, traditional_bowtie, grin, thighband_pantyhose, elbow_gloves, heart, highleg_leotard, sitting |
| 2 | 7 |  |  |  |  |  | 1girl, alternate_costume, cleavage, cosplay, holding_pom_poms, millennium_cheerleader_outfit_(blue_archive), solo, bare_shoulders, blush, crop_top, detached_collar, grin, looking_at_viewer, navel, stomach, white_skirt, long_bangs, midriff, miniskirt, simple_background, pleated_skirt, white_background, armpits, one_eye_covered |
| 3 | 27 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, maid_headdress, solo, white_apron, black_dress, white_gloves, blue_bowtie, elbow_gloves, puffy_short_sleeves, white_thighhighs, blush, choker, maid_apron, simple_background, white_background, frilled_apron, grin, garter_straps |
| 4 | 26 |  |  |  |  |  | 1girl, collared_shirt, looking_at_viewer, pleated_skirt, school_uniform, solo, white_shirt, grin, black_skirt, simple_background, wrist_scrunchie, white_background, black_choker, blue_bowtie, blush, one_eye_covered, blue_nails, nail_polish, blue_scrunchie, shirt_tucked_in |
| 5 | 11 |  |  |  |  |  | 1girl, blue_nails, collared_shirt, grin, looking_at_viewer, nail_polish, pleated_skirt, school_uniform, solo, white_shirt, blush, wrist_scrunchie, black_choker, black_skirt, button_gap, blue_bowtie, cleavage, shirt_tucked_in, blue_halo, collarbone, eyes_visible_through_hair, school_bag, thighs, white_background |
| 6 | 7 |  |  |  |  |  | 1girl, alternate_costume, bare_shoulders, blush, cleavage, grin, looking_at_viewer, solo, blue_halo, thighs, collarbone, blonde_hair, nail_polish, sitting, black_shorts, blue_dress, blue_nails, camisole, choker, dolphin_shorts, indoors, long_bangs, short_shorts |
| 7 | 10 |  |  |  |  |  | 1girl, alternate_costume, bare_shoulders, blush, cleavage, looking_at_viewer, navel, outdoors, solo, stomach, collarbone, blue_sky, day, blue_bikini, grin, side-tie_bikini_bottom, sideboob, water, halterneck, nail_polish, ocean, thighs, beach, black_choker, blonde_hair, blue_nails, open_mouth, string_bikini |
| 8 | 6 |  |  |  |  |  | 1girl, alternate_costume, looking_at_viewer, obi, solo, wide_sleeves, floral_print, grin, blue_nails, blush, holding, nail_polish, blue_halo, blue_kimono, hair_bun, long_sleeves, print_kimono, white_kimono, yukata |
| 9 | 7 |  |  |  |  |  | blush, interracial, nude, 1boy, 1girl, dark-skinned_male, nipples, sex, solo_focus, choker, erection, large_penis, testicles, anal, blue_halo, cum_in_ass, ejaculating_while_penetrated, futa_with_male, grin, hetero, navel, spread_legs, sweat, uncensored, white_thighhighs, cum_overflow, full-package_futanari, reverse_cowgirl_position, stomach, thighs, torso_grab, veiny_penis |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | blue_bowtie | blue_leotard | brown_pantyhose | cleavage | detached_collar | elbow_gloves | highleg_leotard | looking_at_viewer | official_alternate_costume | playboy_bunny | solo | strapless_leotard | thighband_pantyhose | traditional_bowtie | white_gloves | blush | grin | black_pantyhose | sitting | blonde_hair | simple_background | white_background | heart | alternate_costume | cosplay | holding_pom_poms | millennium_cheerleader_outfit_(blue_archive) | crop_top | navel | stomach | white_skirt | long_bangs | midriff | miniskirt | pleated_skirt | armpits | one_eye_covered | maid_headdress | white_apron | black_dress | puffy_short_sleeves | white_thighhighs | choker | maid_apron | frilled_apron | garter_straps | collared_shirt | school_uniform | white_shirt | black_skirt | wrist_scrunchie | black_choker | blue_nails | nail_polish | blue_scrunchie | shirt_tucked_in | button_gap | blue_halo | collarbone | eyes_visible_through_hair | school_bag | thighs | black_shorts | blue_dress | camisole | dolphin_shorts | indoors | short_shorts | outdoors | blue_sky | day | blue_bikini | side-tie_bikini_bottom | sideboob | water | halterneck | ocean | beach | open_mouth | string_bikini | obi | wide_sleeves | floral_print | holding | blue_kimono | hair_bun | long_sleeves | print_kimono | white_kimono | yukata | interracial | nude | 1boy | dark-skinned_male | nipples | sex | solo_focus | erection | large_penis | testicles | anal | cum_in_ass | ejaculating_while_penetrated | futa_with_male | hetero | spread_legs | sweat | uncensored | cum_overflow | full-package_futanari | reverse_cowgirl_position | torso_grab | veiny_penis |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------|:---------------|:------------------|:-----------|:------------------|:---------------|:------------------|:--------------------|:-----------------------------|:----------------|:-------|:--------------------|:----------------------|:---------------------|:---------------|:--------|:-------|:------------------|:----------|:--------------|:--------------------|:-------------------|:--------|:--------------------|:----------|:-------------------|:-----------------------------------------------|:-----------|:--------|:----------|:--------------|:-------------|:----------|:------------|:----------------|:----------|:------------------|:-----------------|:--------------|:--------------|:----------------------|:-------------------|:---------|:-------------|:----------------|:----------------|:-----------------|:-----------------|:--------------|:--------------|:------------------|:---------------|:-------------|:--------------|:-----------------|:------------------|:-------------|:------------|:-------------|:----------------------------|:-------------|:---------|:---------------|:-------------|:-----------|:-----------------|:----------|:---------------|:-----------|:-----------|:------|:--------------|:-------------------------|:-----------|:--------|:-------------|:--------|:--------|:-------------|:----------------|:------|:---------------|:---------------|:----------|:--------------|:-----------|:---------------|:---------------|:---------------|:---------|:--------------|:-------|:-------|:--------------------|:----------|:------|:-------------|:-----------|:--------------|:------------|:-------|:-------------|:-------------------------------|:-----------------|:---------|:--------------|:--------|:-------------|:---------------|:------------------------|:---------------------------|:-------------|:--------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | | | | X | X | | | X | | | X | | | | | X | X | | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 27 |  |  |  |  |  | X | | X | | | X | | X | | X | | | X | | | | X | X | X | | | | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 26 |  |  |  |  |  | X | | X | | | | | | | X | | | X | | | | | X | X | | | | X | X | | | | | | | | | | | | | X | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 11 |  |  |  |  |  | X | | X | | | X | | | | X | | | X | | | | | X | X | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | X | | | | X | | | | X | | | X | | | | | X | X | | X | X | | | | X | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | X | X | | | | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 10 |  |  |  |  |  | X | X | | | | X | | | | X | | | X | | | | | X | X | | | X | | | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | | | | | | | | X | | | X | | | | | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | X | X | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Den4ikAI/ruWikiHow_instructions | ---
license: mit
---
|
heliosprime/twitter_dataset_1713219845 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 20691
num_examples: 54
download_size: 19877
dataset_size: 20691
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713219845"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
intfloat/multilingual_cc_news | ---
size_categories:
- 100M<n<1B
language:
- en
- zh
- fr
- de
- af
- ar
---
### Dataset Summary
This dataset is based on [CloverSearch/cc-news-mutlilingual](https://huggingface.co/datasets/CloverSearch/cc-news-mutlilingual).
We add a loading script so the multilingual CC-News dataset can be accessed through the HuggingFace `datasets` API instead of downloading the raw data files directly.
### Data Fields
- `title`: a `string` feature.
- `maintext`: a `string` feature.
- `url`: a `string` feature.
- `date_publish`: a `string` feature.
### How to use this dataset
You can load the CC-News subset for any supported language:
```python
from datasets import load_dataset
dataset = load_dataset("intfloat/multilingual_cc_news", languages=["af"])
```
## Supported Languages
```
af
als
am
an
ar
arz
as
ast
av
az
azb
ba
bar
bcl
be
bg
bh
bn
bo
bpy
br
bs
bxr
ca
cbk
ce
ceb
ckb
co
cs
cv
cy
da
de
diq
dsb
dty
dv
el
eml
en
eo
es
et
eu
fa
fi
fr
fy
ga
gd
gl
gn
gom
gu
gv
he
hi
hif
hr
hsb
ht
hu
hy
ia
id
ie
ilo
io
is
it
ja
jbo
jv
ka
kk
km
kn
ko
krc
ku
kv
kw
ky
la
lb
lez
li
lmo
lo
lt
lv
mai
mg
mhr
min
mk
ml
mn
mr
mrj
ms
mt
mwl
my
myv
mzn
nah
nap
nds
ne
new
nl
nn
no
oc
or
os
pa
pam
pfl
pl
pms
pnb
ps
pt
qu
rm
ro
ru
sa
sah
sc
scn
sco
sd
sh
si
sk
sl
so
sq
sr
su
sv
sw
ta
te
tg
th
tk
tl
tr
tt
tyv
ug
uk
ur
uz
vec
vep
vi
vls
vo
wa
war
wuu
xal
xmf
yi
yo
yue
zh
```
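Because `load_dataset` only fails after it starts fetching files, it can help to check requested language codes against the list above first. The helper below is a sketch, not part of the dataset script; `SUPPORTED_LANGUAGES` is abbreviated here for brevity and would be built from the full list.

```python
# Hypothetical helper (an assumption, not part of this dataset's loading
# script): validate language codes against the supported-language list
# before calling load_dataset, so typos fail fast instead of mid-download.
SUPPORTED_LANGUAGES = frozenset(
    "af ar de en es fr ja ru zh".split()  # abbreviated; see the full list above
)

def validate_languages(languages):
    """Return the requested codes (deduplicated, sorted), raising on unknowns."""
    unknown = sorted(set(languages) - SUPPORTED_LANGUAGES)
    if unknown:
        raise ValueError(f"unsupported language codes: {unknown}")
    return sorted(set(languages))

# After validation, one would call, e.g.:
# dataset = load_dataset("intfloat/multilingual_cc_news",
#                        languages=validate_languages(["af"]))
```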
|
DopeorNope/hermes_removed_v2 | ---
dataset_info:
features:
- name: system
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 27103672
num_examples: 30382
download_size: 15525082
dataset_size: 27103672
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/wikiclir_sv | ---
pretty_name: '`wikiclir/sv`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `wikiclir/sv`
The `wikiclir/sv` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/wikiclir#wikiclir/sv).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=3,785,412
- `queries` (i.e., topics); count=639,073
- `qrels`: (relevance assessments); count=2,069,453
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/wikiclir_sv', 'docs')
for record in docs:
record # {'doc_id': ..., 'title': ..., 'text': ...}
queries = load_dataset('irds/wikiclir_sv', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/wikiclir_sv', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
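Since the `qrels` records share the `{'query_id', 'doc_id', 'relevance', 'iteration'}` shape shown above, a common next step is grouping them into a per-query relevance lookup for evaluation. A minimal sketch, using hand-made records in place of the real download:

```python
from collections import defaultdict

def build_qrels_lookup(qrels):
    """Group qrels records into {query_id: {doc_id: relevance}}."""
    lookup = defaultdict(dict)
    for rec in qrels:
        lookup[rec["query_id"]][rec["doc_id"]] = rec["relevance"]
    return dict(lookup)

# Hand-made sample records mirroring the qrels schema above.
sample = [
    {"query_id": "q1", "doc_id": "d1", "relevance": 1, "iteration": "0"},
    {"query_id": "q1", "doc_id": "d2", "relevance": 0, "iteration": "0"},
    {"query_id": "q2", "doc_id": "d1", "relevance": 1, "iteration": "0"},
]
lookup = build_qrels_lookup(sample)
```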
## Citation Information
```
@inproceedings{sasaki-etal-2018-cross,
title = "Cross-Lingual Learning-to-Rank with Shared Representations",
author = "Sasaki, Shota and
Sun, Shuo and
Schamoni, Shigehiko and
Duh, Kevin and
Inui, Kentaro",
booktitle = "Proceedings of the 2018 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)",
month = jun,
year = "2018",
address = "New Orleans, Louisiana",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/N18-2073",
doi = "10.18653/v1/N18-2073",
pages = "458--463"
}
```
|
open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k | ---
pretty_name: Evaluation run of yihan6324/llama2-7b-instructmining-orca-40k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yihan6324/llama2-7b-instructmining-orca-40k](https://huggingface.co/yihan6324/llama2-7b-instructmining-orca-40k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T00:53:27.654117](https://huggingface.co/datasets/open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k/blob/main/results_2023-08-18T00%3A53%3A27.654117.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4847120233306423,\n\
\ \"acc_stderr\": 0.03527399847085323,\n \"acc_norm\": 0.4884455010512822,\n\
\ \"acc_norm_stderr\": 0.035257414280301984,\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5103220670450638,\n\
\ \"mc2_stderr\": 0.015890639542177364\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5298634812286689,\n \"acc_stderr\": 0.014585305840007105,\n\
\ \"acc_norm\": 0.5674061433447098,\n \"acc_norm_stderr\": 0.014478005694182524\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6196972714598685,\n\
\ \"acc_stderr\": 0.004844690404713595,\n \"acc_norm\": 0.8024297948615814,\n\
\ \"acc_norm_stderr\": 0.0039735233080143454\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270655,\n\
\ \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.023393826500484865,\n \"\
acc_norm\": 0.291005291005291,\n \"acc_norm_stderr\": 0.023393826500484865\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5645161290322581,\n \"acc_stderr\": 0.028206225591502737,\n \"\
acc_norm\": 0.5645161290322581,\n \"acc_norm_stderr\": 0.028206225591502737\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39408866995073893,\n \"acc_stderr\": 0.03438157967036543,\n \"\
acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.03438157967036543\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.03804913653971011,\n\
\ \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.03804913653971011\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6528497409326425,\n \"acc_stderr\": 0.03435696168361355,\n\
\ \"acc_norm\": 0.6528497409326425,\n \"acc_norm_stderr\": 0.03435696168361355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.025275892070240634,\n\
\ \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.025275892070240634\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.026335739404055803,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.026335739404055803\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4831932773109244,\n \"acc_stderr\": 0.03246013680375308,\n \
\ \"acc_norm\": 0.4831932773109244,\n \"acc_norm_stderr\": 0.03246013680375308\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6623853211009174,\n \"acc_stderr\": 0.02027526598663891,\n \"\
acc_norm\": 0.6623853211009174,\n \"acc_norm_stderr\": 0.02027526598663891\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6372549019607843,\n \"acc_stderr\": 0.03374499356319355,\n \"\
acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.03374499356319355\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6582278481012658,\n \"acc_stderr\": 0.03087453753755362,\n \
\ \"acc_norm\": 0.6582278481012658,\n \"acc_norm_stderr\": 0.03087453753755362\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.515695067264574,\n\
\ \"acc_stderr\": 0.0335412657542081,\n \"acc_norm\": 0.515695067264574,\n\
\ \"acc_norm_stderr\": 0.0335412657542081\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.48466257668711654,\n \"acc_stderr\": 0.039265223787088424,\n\
\ \"acc_norm\": 0.48466257668711654,\n \"acc_norm_stderr\": 0.039265223787088424\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6752136752136753,\n\
\ \"acc_stderr\": 0.03067902276549883,\n \"acc_norm\": 0.6752136752136753,\n\
\ \"acc_norm_stderr\": 0.03067902276549883\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6704980842911877,\n\
\ \"acc_stderr\": 0.016808322261740467,\n \"acc_norm\": 0.6704980842911877,\n\
\ \"acc_norm_stderr\": 0.016808322261740467\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.47109826589595377,\n \"acc_stderr\": 0.02687408588351835,\n\
\ \"acc_norm\": 0.47109826589595377,\n \"acc_norm_stderr\": 0.02687408588351835\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n\
\ \"acc_stderr\": 0.014487500852850407,\n \"acc_norm\": 0.25027932960893856,\n\
\ \"acc_norm_stderr\": 0.014487500852850407\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.02847293847803353,\n\
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.02847293847803353\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n\
\ \"acc_stderr\": 0.028071928247946208,\n \"acc_norm\": 0.5755627009646302,\n\
\ \"acc_norm_stderr\": 0.028071928247946208\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5308641975308642,\n \"acc_stderr\": 0.02776768960683393,\n\
\ \"acc_norm\": 0.5308641975308642,\n \"acc_norm_stderr\": 0.02776768960683393\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650154,\n \
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650154\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38722294654498046,\n\
\ \"acc_stderr\": 0.012441155326854924,\n \"acc_norm\": 0.38722294654498046,\n\
\ \"acc_norm_stderr\": 0.012441155326854924\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.030343264224213528,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.030343264224213528\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4395424836601307,\n \"acc_stderr\": 0.02007942040808792,\n \
\ \"acc_norm\": 0.4395424836601307,\n \"acc_norm_stderr\": 0.02007942040808792\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972744,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972744\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.03125127591089165,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.03125127591089165\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n\
\ \"acc_stderr\": 0.03307615947979034,\n \"acc_norm\": 0.6766169154228856,\n\
\ \"acc_norm_stderr\": 0.03307615947979034\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.03799857454479637,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.03799857454479637\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03615507630310935,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03615507630310935\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5103220670450638,\n\
\ \"mc2_stderr\": 0.015890639542177364\n }\n}\n```"
repo_url: https://huggingface.co/yihan6324/llama2-7b-instructmining-orca-40k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|arc:challenge|25_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hellaswag|10_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T00:53:27.654117.parquet'
- config_name: results
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- results_2023-08-18T00:53:27.654117.parquet
- split: latest
path:
- results_2023-08-18T00:53:27.654117.parquet
---
# Dataset Card for Evaluation run of yihan6324/llama2-7b-instructmining-orca-40k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yihan6324/llama2-7b-instructmining-orca-40k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yihan6324/llama2-7b-instructmining-orca-40k](https://huggingface.co/yihan6324/llama2-7b-instructmining-orca-40k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset(
    "open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k",
    "harness_truthfulqa_mc_0",
    split="latest",  # or a timestamped split such as "2023_08_18T00_53_27.654117"
)
```
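As noted above, each run's split is named after the run timestamp. A minimal sketch of the mapping (an illustration, not part of the `datasets` API): dashes and colons in the ISO timestamp become underscores, while the fractional-second part is kept as-is.

```python
def timestamp_to_split(ts: str) -> str:
    # Run timestamp "2023-08-18T00:53:27.654117" becomes the
    # split name "2023_08_18T00_53_27.654117": dashes and colons
    # are replaced with underscores; the dot before the
    # microseconds is left unchanged.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-08-18T00:53:27.654117"))
# 2023_08_18T00_53_27.654117
```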
## Latest results
These are the [latest results from run 2023-08-18T00:53:27.654117](https://huggingface.co/datasets/open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k/blob/main/results_2023-08-18T00%3A53%3A27.654117.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each one can be found in the "results" configuration and in the "latest" split of the corresponding eval):
```python
{
"all": {
"acc": 0.4847120233306423,
"acc_stderr": 0.03527399847085323,
"acc_norm": 0.4884455010512822,
"acc_norm_stderr": 0.035257414280301984,
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.5103220670450638,
"mc2_stderr": 0.015890639542177364
},
"harness|arc:challenge|25": {
"acc": 0.5298634812286689,
"acc_stderr": 0.014585305840007105,
"acc_norm": 0.5674061433447098,
"acc_norm_stderr": 0.014478005694182524
},
"harness|hellaswag|10": {
"acc": 0.6196972714598685,
"acc_stderr": 0.004844690404713595,
"acc_norm": 0.8024297948615814,
"acc_norm_stderr": 0.0039735233080143454
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.023393826500484865,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.023393826500484865
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5645161290322581,
"acc_stderr": 0.028206225591502737,
"acc_norm": 0.5645161290322581,
"acc_norm_stderr": 0.028206225591502737
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.03438157967036543,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.03438157967036543
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6121212121212121,
"acc_stderr": 0.03804913653971011,
"acc_norm": 0.6121212121212121,
"acc_norm_stderr": 0.03804913653971011
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6528497409326425,
"acc_stderr": 0.03435696168361355,
"acc_norm": 0.6528497409326425,
"acc_norm_stderr": 0.03435696168361355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.025275892070240634,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.025275892070240634
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.026335739404055803,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.026335739404055803
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4831932773109244,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.4831932773109244,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6623853211009174,
"acc_stderr": 0.02027526598663891,
"acc_norm": 0.6623853211009174,
"acc_norm_stderr": 0.02027526598663891
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.03374499356319355,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.03374499356319355
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6582278481012658,
"acc_stderr": 0.03087453753755362,
"acc_norm": 0.6582278481012658,
"acc_norm_stderr": 0.03087453753755362
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.515695067264574,
"acc_stderr": 0.0335412657542081,
"acc_norm": 0.515695067264574,
"acc_norm_stderr": 0.0335412657542081
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009225,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009225
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.48466257668711654,
"acc_stderr": 0.039265223787088424,
"acc_norm": 0.48466257668711654,
"acc_norm_stderr": 0.039265223787088424
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6752136752136753,
"acc_stderr": 0.03067902276549883,
"acc_norm": 0.6752136752136753,
"acc_norm_stderr": 0.03067902276549883
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6704980842911877,
"acc_stderr": 0.016808322261740467,
"acc_norm": 0.6704980842911877,
"acc_norm_stderr": 0.016808322261740467
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.47109826589595377,
"acc_stderr": 0.02687408588351835,
"acc_norm": 0.47109826589595377,
"acc_norm_stderr": 0.02687408588351835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25027932960893856,
"acc_stderr": 0.014487500852850407,
"acc_norm": 0.25027932960893856,
"acc_norm_stderr": 0.014487500852850407
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.02847293847803353,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.02847293847803353
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946208,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946208
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5308641975308642,
"acc_stderr": 0.02776768960683393,
"acc_norm": 0.5308641975308642,
"acc_norm_stderr": 0.02776768960683393
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.028267657482650154,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.028267657482650154
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38722294654498046,
"acc_stderr": 0.012441155326854924,
"acc_norm": 0.38722294654498046,
"acc_norm_stderr": 0.012441155326854924
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.030343264224213528,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.030343264224213528
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4395424836601307,
"acc_stderr": 0.02007942040808792,
"acc_norm": 0.4395424836601307,
"acc_norm_stderr": 0.02007942040808792
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972744,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972744
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.03125127591089165,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.03125127591089165
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6766169154228856,
"acc_stderr": 0.03307615947979034,
"acc_norm": 0.6766169154228856,
"acc_norm_stderr": 0.03307615947979034
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479637,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479637
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03615507630310935,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03615507630310935
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.5103220670450638,
"mc2_stderr": 0.015890639542177364
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ChayanM/MIMIC-Impression-Only-Data-Train-Test | ---
dataset_info:
features:
- name: File_Path
dtype: string
- name: Impression
dtype: string
splits:
- name: train
num_bytes: 42392266.245605685
num_examples: 195650
- name: test
num_bytes: 2231308.7543943133
num_examples: 10298
download_size: 21665425
dataset_size: 44623575.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
eduardo852/NTL | ---
dataset_info:
features:
- name: inputs
struct:
- name: text
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
dtype: 'null'
- name: annotation_agent
dtype: 'null'
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: 'null'
- name: metadata
struct:
- name: category
dtype: int64
- name: status
dtype: string
- name: event_timestamp
dtype: 'null'
- name: metrics
dtype: 'null'
- name: search_keywords
dtype: 'null'
splits:
- name: train
num_bytes: 888250
num_examples: 5001
download_size: 274554
dataset_size: 888250
---
# Dataset Card for "NTL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lizziepika/strava_activities_runs | ---
license: mit
---
|
psyche/instructions | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 67454649
num_examples: 112104
- name: validation
num_bytes: 7528895
num_examples: 12429
download_size: 43318862
dataset_size: 74983544
---
# Dataset Card for "instructions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/codegeass | ---
license: mit
tags:
- art
size_categories:
- 10K<n<100K
---
# Bangumi Image Base of Code Geass
This is the image base of bangumi Code Geass. We detected 136 characters and 10361 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may still be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
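The card does not prescribe a cleaning procedure. As one simple, hypothetical preprocessing pass (not part of the original pipeline), byte-identical duplicate frames can be dropped before manual review:

```python
import hashlib


def drop_exact_duplicates(image_paths):
    # Keep the first occurrence of each distinct file; byte-identical
    # frames (common when extracting stills from anime episodes) are
    # discarded. Near-duplicates would need perceptual hashing instead.
    seen = set()
    kept = []
    for path in image_paths:
        with open(path, 'rb') as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        if digest not in seen:
            seen.add(digest)
            kept.append(path)
    return kept
```

More aggressive filtering (blur detection, near-duplicate hashing) is left to the user, since the acceptable noise level depends on the training task.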
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:----------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|
| 0 | 37 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 97 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 119 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 187 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 218 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 131 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 77 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 128 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 79 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 42 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 31 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 39 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 13 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 42 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 52 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 89 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 79 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 46 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 75 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 82 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 28 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 21 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 51 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 23 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 26 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 44 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 1363 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 21 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 31 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 109 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 20 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 16 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 178 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 26 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 778 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 16 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 44 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 61 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 71 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 14 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 116 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 20 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 20 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 113 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 298 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 19 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 43 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 141 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 13 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 23 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 48 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 20 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 36 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 19 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 14 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 16 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 17 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 90 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 33 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 17 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 27 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 197 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 19 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 43 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 591 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 44 | [Download](65/dataset.zip) |  |  |  |  |  |  |  |  |
| 66 | 73 | [Download](66/dataset.zip) |  |  |  |  |  |  |  |  |
| 67 | 60 | [Download](67/dataset.zip) |  |  |  |  |  |  |  |  |
| 68 | 151 | [Download](68/dataset.zip) |  |  |  |  |  |  |  |  |
| 69 | 22 | [Download](69/dataset.zip) |  |  |  |  |  |  |  |  |
| 70 | 20 | [Download](70/dataset.zip) |  |  |  |  |  |  |  |  |
| 71 | 74 | [Download](71/dataset.zip) |  |  |  |  |  |  |  |  |
| 72 | 20 | [Download](72/dataset.zip) |  |  |  |  |  |  |  |  |
| 73 | 54 | [Download](73/dataset.zip) |  |  |  |  |  |  |  |  |
| 74 | 26 | [Download](74/dataset.zip) |  |  |  |  |  |  |  |  |
| 75 | 28 | [Download](75/dataset.zip) |  |  |  |  |  |  |  |  |
| 76 | 30 | [Download](76/dataset.zip) |  |  |  |  |  |  |  |  |
| 77 | 14 | [Download](77/dataset.zip) |  |  |  |  |  |  |  |  |
| 78 | 13 | [Download](78/dataset.zip) |  |  |  |  |  |  |  |  |
| 79 | 55 | [Download](79/dataset.zip) |  |  |  |  |  |  |  |  |
| 80 | 12 | [Download](80/dataset.zip) |  |  |  |  |  |  |  |  |
| 81 | 165 | [Download](81/dataset.zip) |  |  |  |  |  |  |  |  |
| 82 | 11 | [Download](82/dataset.zip) |  |  |  |  |  |  |  |  |
| 83 | 185 | [Download](83/dataset.zip) |  |  |  |  |  |  |  |  |
| 84 | 72 | [Download](84/dataset.zip) |  |  |  |  |  |  |  |  |
| 85 | 9 | [Download](85/dataset.zip) |  |  |  |  |  |  |  |  |
| 86 | 32 | [Download](86/dataset.zip) |  |  |  |  |  |  |  |  |
| 87 | 39 | [Download](87/dataset.zip) |  |  |  |  |  |  |  |  |
| 88 | 120 | [Download](88/dataset.zip) |  |  |  |  |  |  |  |  |
| 89 | 126 | [Download](89/dataset.zip) |  |  |  |  |  |  |  |  |
| 90 | 18 | [Download](90/dataset.zip) |  |  |  |  |  |  |  |  |
| 91 | 44 | [Download](91/dataset.zip) |  |  |  |  |  |  |  |  |
| 92 | 10 | [Download](92/dataset.zip) |  |  |  |  |  |  |  |  |
| 93 | 6 | [Download](93/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 94 | 43 | [Download](94/dataset.zip) |  |  |  |  |  |  |  |  |
| 95 | 207 | [Download](95/dataset.zip) |  |  |  |  |  |  |  |  |
| 96 | 12 | [Download](96/dataset.zip) |  |  |  |  |  |  |  |  |
| 97 | 11 | [Download](97/dataset.zip) |  |  |  |  |  |  |  |  |
| 98 | 15 | [Download](98/dataset.zip) |  |  |  |  |  |  |  |  |
| 99 | 17 | [Download](99/dataset.zip) |  |  |  |  |  |  |  |  |
| 100 | 20 | [Download](100/dataset.zip) |  |  |  |  |  |  |  |  |
| 101 | 9 | [Download](101/dataset.zip) |  |  |  |  |  |  |  |  |
| 102 | 253 | [Download](102/dataset.zip) |  |  |  |  |  |  |  |  |
| 103 | 10 | [Download](103/dataset.zip) |  |  |  |  |  |  |  |  |
| 104 | 16 | [Download](104/dataset.zip) |  |  |  |  |  |  |  |  |
| 105 | 28 | [Download](105/dataset.zip) |  |  |  |  |  |  |  |  |
| 106 | 19 | [Download](106/dataset.zip) |  |  |  |  |  |  |  |  |
| 107 | 9 | [Download](107/dataset.zip) |  |  |  |  |  |  |  |  |
| 108 | 17 | [Download](108/dataset.zip) |  |  |  |  |  |  |  |  |
| 109 | 12 | [Download](109/dataset.zip) |  |  |  |  |  |  |  |  |
| 110 | 7 | [Download](110/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 111 | 11 | [Download](111/dataset.zip) |  |  |  |  |  |  |  |  |
| 112 | 20 | [Download](112/dataset.zip) |  |  |  |  |  |  |  |  |
| 113 | 17 | [Download](113/dataset.zip) |  |  |  |  |  |  |  |  |
| 114 | 10 | [Download](114/dataset.zip) |  |  |  |  |  |  |  |  |
| 115 | 9 | [Download](115/dataset.zip) |  |  |  |  |  |  |  |  |
| 116 | 22 | [Download](116/dataset.zip) |  |  |  |  |  |  |  |  |
| 117 | 308 | [Download](117/dataset.zip) |  |  |  |  |  |  |  |  |
| 118 | 423 | [Download](118/dataset.zip) |  |  |  |  |  |  |  |  |
| 119 | 19 | [Download](119/dataset.zip) |  |  |  |  |  |  |  |  |
| 120 | 7 | [Download](120/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 121 | 9 | [Download](121/dataset.zip) |  |  |  |  |  |  |  |  |
| 122 | 8 | [Download](122/dataset.zip) |  |  |  |  |  |  |  |  |
| 123 | 114 | [Download](123/dataset.zip) |  |  |  |  |  |  |  |  |
| 124 | 88 | [Download](124/dataset.zip) |  |  |  |  |  |  |  |  |
| 125 | 5 | [Download](125/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 126 | 10 | [Download](126/dataset.zip) |  |  |  |  |  |  |  |  |
| 127 | 14 | [Download](127/dataset.zip) |  |  |  |  |  |  |  |  |
| 128 | 8 | [Download](128/dataset.zip) |  |  |  |  |  |  |  |  |
| 129 | 5 | [Download](129/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 130 | 13 | [Download](130/dataset.zip) |  |  |  |  |  |  |  |  |
| 131 | 14 | [Download](131/dataset.zip) |  |  |  |  |  |  |  |  |
| 132 | 7 | [Download](132/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 133 | 30 | [Download](133/dataset.zip) |  |  |  |  |  |  |  |  |
| 134 | 7 | [Download](134/dataset.zip) |  |  |  |  |  |  |  | N/A |
| noise | 348 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
novaia/world-heightmaps-256px | ---
license: apache-2.0
task_categories:
- image-classification
- text-to-image
- unconditional-image-generation
size_categories:
- 100K<n<1M
---
# World Heightmaps 256px
This is a dataset of 256x256 Earth heightmaps generated from [SRTM 1 Arc-Second Global](https://huggingface.co/datasets/hayden-donnelly/srtm-1-arc-second-global).
Each heightmap is labelled according to its latitude and longitude. There are 573,995 samples. It is the same as
[World Heightmaps 360px](https://huggingface.co/datasets/novaia/world-heightmaps-360px) but downsampled to 256x256.
## Method
1. Convert GeoTIFFs into PNGs with Rasterio.
```python
import rasterio
import matplotlib.pyplot as plt
import os

input_directory = '...'
output_directory = '...'
file_list = os.listdir(input_directory)
for i in range(len(file_list)):
    image = rasterio.open(input_directory + file_list[i])
    plt.imsave(output_directory + file_list[i][0:-4] + '.png', image.read(1), cmap='gray')
```
2. Split PNGs into 100 patches with Split Image.
```python
from split_image import split_image
import os

input_directory = '...'
output_directory = '...'
file_list = os.listdir(input_directory)
for i in range(len(file_list)):
    split_image(input_directory + file_list[i], 10, 10, should_square=True, should_cleanup=False, output_dir=output_directory)
```
3. Hand-pick a set of corrupted and uncorrupted heightmaps, then train a discriminator to automatically filter the whole dataset.
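Unlike the other steps, step 3 has no accompanying snippet in this card. A minimal sketch of the filtering stage might look like the following, where `predict_is_corrupted` is a stand-in for the trained discriminator (the actual model is not described here) and uses a crude flatness heuristic purely for illustration:

```python
import numpy as np


def predict_is_corrupted(heightmap: np.ndarray) -> bool:
    # Stand-in for the learned discriminator: flag nearly flat patches,
    # a common failure mode around voids in SRTM data. The real filter
    # was a trained model, not this heuristic.
    return float(heightmap.std()) < 1.0


def filter_heightmaps(heightmaps):
    # Keep only the patches the discriminator considers uncorrupted.
    return [h for h in heightmaps if not predict_is_corrupted(h)]
```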
4. Downsample from 360x360 to 256x256 with Pillow and the Lanczos resampling method.
```python
import glob
from PIL import Image

paths = glob.glob('world-heightmaps-360px-png/data/*/*')
for file_name in paths:
    image = Image.open(file_name)
    if image.width == 256:
        continue
    print(file_name)
    image = image.resize((256, 256), resample=Image.LANCZOS)
    image.save(file_name)
```
5. Compile images into parquet files.
```python
import pyarrow as pa
import pyarrow.parquet as pq
import pandas as pd
from PIL import Image
import os
import io
import json

samples_per_file = 10_000
root_dir = 'data/datasets/world-heightmaps-256px-png'
df = pd.read_csv(os.path.join(root_dir, 'metadata.csv'))
df = df.sample(frac=1).reset_index(drop=True)

def save_table(image_data, table_number):
    print(f'Entries in table {table_number}: {len(image_data)}')
    schema = pa.schema(
        fields=[
            ('heightmap', pa.struct([('bytes', pa.binary()), ('path', pa.string())])),
            ('latitude', pa.string()),
            ('longitude', pa.string())
        ],
        metadata={
            b'huggingface': json.dumps({
                'info': {
                    'features': {
                        'heightmap': {'_type': 'Image'},
                        'latitude': {'_type': 'Value', 'dtype': 'string'},
                        'longitude': {'_type': 'Value', 'dtype': 'string'}
                    }
                }
            }).encode('utf-8')
        }
    )
    table = pa.Table.from_pylist(image_data, schema=schema)
    pq.write_table(table, f'data/world-heightmaps-256px-parquet/{str(table_number).zfill(4)}.parquet')

image_data = []
samples_in_current_file = 0
current_file_number = 0
for i, row in df.iterrows():
    if samples_in_current_file >= samples_per_file:
        save_table(image_data, current_file_number)
        image_data = []
        samples_in_current_file = 0
        current_file_number += 1
    samples_in_current_file += 1
    image_path = row['file_name']
    with Image.open(os.path.join(root_dir, image_path)) as image:
        image_bytes = io.BytesIO()
        image.save(image_bytes, format='PNG')
    image_dict = {
        'heightmap': {
            'bytes': image_bytes.getvalue(),
            'path': image_path
        },
        'latitude': str(row['latitude']),
        'longitude': str(row['longitude'])
    }
    image_data.append(image_dict)
save_table(image_data, current_file_number)
``` |
scene-genie/instagram-dataset-frr | ---
dataset_info:
features:
- name: user
dtype: string
- name: image_id
dtype: int64
- name: original_image_path
dtype: string
- name: original_image
dtype: image
- name: langsam_res
dtype: string
- name: caption
dtype: string
- name: brand
dtype: string
- name: quality
dtype: string
- name: lifestyle
dtype: bool
- name: product
dtype: bool
- name: text
dtype: bool
- name: frr_image
dtype: image
splits:
- name: train
num_bytes: 464917769.748
num_examples: 1774
download_size: 463202785
dataset_size: 464917769.748
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nostradamus89/1C_code_1000 | ---
license: mit
---
|
bharadwajkg/planogram-sample-1024 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1491395.0
num_examples: 10
download_size: 1492362
dataset_size: 1491395.0
---
# Dataset Card for "planogram-sample-1024"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
justinphan3110/harmful_harmless_instructions | ---
dataset_info:
features:
- name: sentence
sequence: string
- name: label
sequence: bool
splits:
- name: train
num_bytes: 20636
num_examples: 128
- name: test
num_bytes: 62788
num_examples: 384
download_size: 49815
dataset_size: 83424
---
# Dataset Card for "harmful_harmless_instructions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DLI-Lab/code-dpo-classification | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
dataset_info:
features:
- name: description
dtype: string
- name: index
dtype: int64
- name: invaluabe_feedback
dtype: string
- name: wrong_code
dtype: string
- name: valuabe_feedback
dtype: string
splits:
- name: train
num_bytes: 27040658
num_examples: 17140
- name: eval
num_bytes: 2998200
num_examples: 1904
download_size: 9705149
dataset_size: 30038858
---
# Dataset Card for "code-dpo-classification"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigscience-data/roots_eu_ted_talks_iwslt | ---
language: eu
license: cc-by-nc-nd-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_eu_ted_talks_iwslt
# WIT Ted Talks
- Dataset uid: `ted_talks_iwslt`
### Description
The Web Inventory Talk is a collection of the original TED talks and their translated versions. The translations are available in 109+ languages, though the distribution is not uniform.
### Homepage
https://github.com/huggingface/datasets/blob/master/datasets/ted_talks_iwslt/README.md
### Licensing
- open license
- cc-by-nc-4.0: Creative Commons Attribution Non Commercial 4.0 International
TED makes its collection of video recordings and transcripts of talks available under the Creative Commons BY-NC-ND license (look here). WIT3 acknowledges the authorship of TED talks (BY condition) and does not redistribute transcripts for commercial purposes (NC). As regards the integrity of the work (ND), WIT3 only changes the format of the container, while preserving the original contents. WIT3 aims to support research on human language processing as well as the diffusion of TED Talks!
### Speaker Locations
- Southern Europe
- Italy
### Sizes
- 0.0305 % of total
- 0.0736 % of ar
- 0.2002 % of pt
- 0.0128 % of zh
- 0.2236 % of vi
- 0.0330 % of fr
- 0.0545 % of es
- 0.0122 % of en
- 0.3704 % of id
- 0.0373 % of indic-hi
- 0.0330 % of indic-ta
- 0.1393 % of indic-mr
- 0.0305 % of ca
- 0.1179 % of indic-ur
- 0.0147 % of indic-bn
- 0.0240 % of indic-ml
- 0.0244 % of indic-te
- 0.0503 % of indic-gu
- 0.0211 % of indic-kn
- 0.0274 % of eu
- 0.0023 % of indic-as
- 0.0001 % of indic-pa
### BigScience processing steps
#### Filters applied to: ar
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: pt
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: zh
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: fr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: es
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: en
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: id
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: ca
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ur
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-as
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-pa
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
|
enelpe/testitest | ---
dataset_info:
features:
- name: Sentences
sequence: string
- name: Labels
sequence: int64
splits:
- name: train
num_bytes: 7587838
num_examples: 23196
download_size: 0
dataset_size: 7587838
---
# Dataset Card for "testitest"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ckandemir/bitcoin_tweets_sentiment_kaggle | ---
multilinguality:
- multilingual
size_categories:
- 100K<n<1M
source_datasets:
- kaggle
task_categories:
- text-classification
task_ids:
- sentiment-classification
tags:
- datasets
- bitcoin
- text-classification
- sentiment-analysis
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: eval
path: data/eval-*
dataset_info:
features:
- name: Date
dtype: string
- name: text
dtype: string
- name: Sentiment
dtype: string
splits:
- name: train
num_bytes: 12842246
num_examples: 77791
- name: test
num_bytes: 1609120
num_examples: 9724
- name: eval
num_bytes: 1598297
num_examples: 9724
download_size: 9868625
dataset_size: 16049663
---
# Dataset Card for "Bitcoin Tweets"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Dataset Distribution](#dataset-distribution)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
### Dataset Summary
This dataset contains a collection of 16 million tweets related to Bitcoin, collected from Twitter. Each tweet is tagged with sentiment (positive, negative, neutral). The dataset was originally created and uploaded to Kaggle by user gauravduttakiit. It is a valuable resource for training and evaluating models for sentiment analysis within the context of cryptocurrency discussions.
### Supported Tasks and Leaderboards
- `text-classification`: This dataset can be used to train a model for sentiment analysis. The performance of the model can be evaluated using standard metrics like accuracy, F1 score, precision, and recall.
### Languages
The text data is primarily in English.
## Dataset Structure
### Data Instances
Each instance in the dataset contains the following fields:
- `Date`: the date the tweet was posted.
- `text`: the text of the tweet.
- `Sentiment`: the sentiment of the tweet, labeled as either "positive", "negative", or "neutral".
### Data Fields
- `Date`: a string containing the date of the tweet.
- `text`: a string containing the text of the tweet.
- `Sentiment`: a string indicating the sentiment of the tweet.
### Data Splits
The dataset is distributed with `train`, `test`, and `eval` splits; see the configuration above for the exact split sizes.
## Dataset Creation
### Curation Rationale
The dataset was curated to analyze the sentiment within the cryptocurrency community, specifically focusing on Bitcoin.
### Source Data
#### Initial Data Collection and Normalization
The data was collected from Twitter using specific keywords related to Bitcoin. For more details regarding data collection, one can refer to the [original Kaggle dataset](https://www.kaggle.com/datasets/gauravduttakiit/bitcoin-tweets-16m-tweets-with-sentiment-tagged).
#### Who are the source data providers?
The data was provided by Kaggle user gauravduttakiit.
### Annotations
The sentiment labels were generated using automated sentiment analysis tools. For more details, refer to the [original Kaggle dataset](https://www.kaggle.com/datasets/gauravduttakiit/bitcoin-tweets-16m-tweets-with-sentiment-tagged).
## Dataset Distribution
### Dataset Curators
The dataset was curated by gauravduttakiit and uploaded to Kaggle.
### Licensing Information
Refer to the [original Kaggle dataset](https://www.kaggle.com/datasets/gauravduttakiit/bitcoin-tweets-16m-tweets-with-sentiment-tagged) for licensing information. |
Mohammed-Altaf/medical-instruction-100k | ---
license: mit
language:
- en
tags:
- medi
- medical
pretty_name: python
size_categories:
- 10K<n<100K
---
# What is the Dataset About?🤷🏼♂️
---
The dataset is useful for training a generative language model for medical applications and instruction-following. It consists of various prompts posed by people [**mentioned as the Human**] and their responses, which include medical terminology such as (but not limited to) drug names, prescriptions, yogic exercise suggestions, breathing exercise suggestions, and a few natural home-made remedies.
# How the Dataset was made?😅
---
I combined all the available open-source datasets into a single data source for training, which is completely open-sourced and somewhat reliable.
* There is another refined and updated version of this dataset here 👉🏼 [Link](https://huggingface.co/datasets/Mohammed-Altaf/medical-instruction-120k)
## Example Training Scripts:
* QLoRA Fine-Tuning -
## Tips:
This is my first dataset upload to Hugging Face, so below are the things I wish I had known:
* Always save your final dataset as JSON Lines before uploading to the Hub.
* Use the records orientation in the JSON, which helps the dataset load properly without errors.
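A file saved in this records/lines format can be read back with just the standard library (a minimal sketch; `load_jsonl` is an illustrative helper name, not part of any dataset tooling):

```python
import json

def load_jsonl(path):
    """Read a records-oriented JSON Lines file into a list of dicts."""
    with open(path, encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]
```

Each line is an independent JSON object, so parsing failures stay local to a single record instead of corrupting the whole file.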
```python
# use below if you are using pandas for data manipulation;
# note the distinct filenames so the test split does not overwrite the train split
train.to_json("train.json", orient='records', lines=True)
test.to_json("test.json", orient='records', lines=True)
``` |
yurifacanha/NCM-dataset | ---
dataset_info:
features:
- name: conversation_id
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 107883250
num_examples: 239226
download_size: 7779099
dataset_size: 107883250
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ismaelvillanuevamiranda/Colo_Rectal_DatasetQA_parsed_finetune | ---
license: mit
---
|
open-llm-leaderboard/details_KoboldAI__OPT-13B-Nerys-v2 | ---
pretty_name: Evaluation run of KoboldAI/OPT-13B-Nerys-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KoboldAI/OPT-13B-Nerys-v2](https://huggingface.co/KoboldAI/OPT-13B-Nerys-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KoboldAI__OPT-13B-Nerys-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T00:09:36.739162](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__OPT-13B-Nerys-v2/blob/main/results_2023-10-22T00-09-36.739162.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.0003630560893118994,\n \"f1\": 0.052374161073825636,\n\
\ \"f1_stderr\": 0.001268760566280153,\n \"acc\": 0.3405215977041276,\n\
\ \"acc_stderr\": 0.007217878569712872\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893118994,\n\
\ \"f1\": 0.052374161073825636,\n \"f1_stderr\": 0.001268760566280153\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148674266\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6787687450670876,\n \"acc_stderr\": 0.013123599324558317\n\
\ }\n}\n```"
repo_url: https://huggingface.co/KoboldAI/OPT-13B-Nerys-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T00_09_36.739162
path:
- '**/details_harness|drop|3_2023-10-22T00-09-36.739162.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T00-09-36.739162.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T00_09_36.739162
path:
- '**/details_harness|gsm8k|5_2023-10-22T00-09-36.739162.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T00-09-36.739162.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:46:37.808962.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:46:37.808962.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:46:37.808962.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T00_09_36.739162
path:
- '**/details_harness|winogrande|5_2023-10-22T00-09-36.739162.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T00-09-36.739162.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_46_37.808962
path:
- results_2023-07-19T18:46:37.808962.parquet
- split: 2023_10_22T00_09_36.739162
path:
- results_2023-10-22T00-09-36.739162.parquet
- split: latest
path:
- results_2023-10-22T00-09-36.739162.parquet
---
# Dataset Card for Evaluation run of KoboldAI/OPT-13B-Nerys-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KoboldAI/OPT-13B-Nerys-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KoboldAI/OPT-13B-Nerys-v2](https://huggingface.co/KoboldAI/OPT-13B-Nerys-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KoboldAI__OPT-13B-Nerys-v2",
"harness_winogrande_5",
split="train")
```
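Because split names encode run timestamps, the most recent run can also be selected programmatically instead of relying on the `latest` alias. A minimal, self-contained sketch (no download needed; the split names below are examples in the format used by this dataset):

```python
from datetime import datetime

def latest_split(split_names):
    """Return the most recent timestamped split name.

    Names follow the pattern 2023_10_22T00_09_36.739162:
    underscore-separated date, 'T', underscore-separated time,
    and fractional seconds.
    """
    def parse(name):
        return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")
    return max(split_names, key=parse)

splits = ["2023_07_19T18_46_37.808962", "2023_10_22T00_09_36.739162"]
print(latest_split(splits))  # 2023_10_22T00_09_36.739162
```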
## Latest results
These are the [latest results from run 2023-10-22T00:09:36.739162](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__OPT-13B-Nerys-v2/blob/main/results_2023-10-22T00-09-36.739162.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893118994,
"f1": 0.052374161073825636,
"f1_stderr": 0.001268760566280153,
"acc": 0.3405215977041276,
"acc_stderr": 0.007217878569712872
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893118994,
"f1": 0.052374161073825636,
"f1_stderr": 0.001268760566280153
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674266
},
"harness|winogrande|5": {
"acc": 0.6787687450670876,
"acc_stderr": 0.013123599324558317
}
}
```
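Once parsed (e.g. with `json.load` on the linked results file), the nested per-task dict above can be flattened into `task/metric` keys for logging or tabular display. A small sketch, abbreviated to two of the tasks and copying the aggregate numbers shown above:

```python
# Aggregated results as shown in the JSON above (abbreviated to two tasks).
results = {
    "harness|gsm8k|5": {"acc": 0.002274450341167551,
                        "acc_stderr": 0.0013121578148674266},
    "harness|winogrande|5": {"acc": 0.6787687450670876,
                             "acc_stderr": 0.013123599324558317},
}

# Flatten the two-level dict into "task/metric" keys.
flat = {f"{task}/{metric}": value
        for task, metrics in results.items()
        for metric, value in metrics.items()}

print(round(flat["harness|winogrande|5/acc"], 4))  # 0.6788
```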
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yzhuang/autotree_automl_100000_heloc_sgosdt_l256_dim10_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 2364400000
num_examples: 100000
- name: validation
num_bytes: 236440000
num_examples: 10000
download_size: 447651417
dataset_size: 2600840000
---
# Dataset Card for "autotree_automl_100000_heloc_sgosdt_l256_dim10_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nayohan/book_summary | ---
dataset_info:
features:
- name: subject
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 265542056
num_examples: 180001
download_size: 135548707
dataset_size: 265542056
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/10026758 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 176
num_examples: 10
download_size: 1326
dataset_size: 176
---
# Dataset Card for "10026758"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_RobbeD__OpenLlama-Platypus-3B | ---
pretty_name: Evaluation run of RobbeD/OpenLlama-Platypus-3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RobbeD/OpenLlama-Platypus-3B](https://huggingface.co/RobbeD/OpenLlama-Platypus-3B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RobbeD__OpenLlama-Platypus-3B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2023-09-23T06:28:14.000432](https://huggingface.co/datasets/open-llm-leaderboard/details_RobbeD__OpenLlama-Platypus-3B/blob/main/results_2023-09-23T06-28-14.000432.json)\
  \ (note that there might be results for other tasks in the repos if successive evals\
  \ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.06145134228187919,\n\
\ \"em_stderr\": 0.002459425856611146,\n \"f1\": 0.11012269295302003,\n\
\ \"f1_stderr\": 0.002656818706713483,\n \"acc\": 0.3355993065948289,\n\
\ \"acc_stderr\": 0.008117942480603072\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.06145134228187919,\n \"em_stderr\": 0.002459425856611146,\n\
\ \"f1\": 0.11012269295302003,\n \"f1_stderr\": 0.002656818706713483\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.011372251705837756,\n \
\ \"acc_stderr\": 0.0029206661987887473\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.65982636148382,\n \"acc_stderr\": 0.013315218762417397\n\
\ }\n}\n```"
repo_url: https://huggingface.co/RobbeD/OpenLlama-Platypus-3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|arc:challenge|25_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T06_28_14.000432
path:
- '**/details_harness|drop|3_2023-09-23T06-28-14.000432.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T06-28-14.000432.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T06_28_14.000432
path:
- '**/details_harness|gsm8k|5_2023-09-23T06-28-14.000432.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T06-28-14.000432.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hellaswag|10_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T06_28_14.000432
path:
- '**/details_harness|winogrande|5_2023-09-23T06-28-14.000432.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T06-28-14.000432.parquet'
- config_name: results
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- results_2023-08-29T10:12:53.419020.parquet
- split: 2023_09_23T06_28_14.000432
path:
- results_2023-09-23T06-28-14.000432.parquet
- split: latest
path:
- results_2023-09-23T06-28-14.000432.parquet
---
# Dataset Card for Evaluation run of RobbeD/OpenLlama-Platypus-3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/RobbeD/OpenLlama-Platypus-3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [RobbeD/OpenLlama-Platypus-3B](https://huggingface.co/RobbeD/OpenLlama-Platypus-3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RobbeD__OpenLlama-Platypus-3B",
"harness_winogrande_5",
split="train")
```
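Run splits are named after the run timestamp, with `:` and `-` replaced by `_` (e.g. `2023_09_23T06_28_14.000432`). A minimal sketch of a helper for turning such split names back into `datetime` objects, so the most recent run can be selected programmatically; the helper name and format string are assumptions inferred from the split names in this card:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Hypothetical helper: parse a timestamp-style split name such as
    # "2023_09_23T06_28_14.000432" (colons/dashes replaced by underscores).
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

# Pick the most recent of several run splits:
runs = ["2023_08_29T10_12_53.419020", "2023_09_23T06_28_14.000432"]
latest_run = max(runs, key=split_to_datetime)
```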
## Latest results
These are the [latest results from run 2023-09-23T06:28:14.000432](https://huggingface.co/datasets/open-llm-leaderboard/details_RobbeD__OpenLlama-Platypus-3B/blob/main/results_2023-09-23T06-28-14.000432.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.06145134228187919,
"em_stderr": 0.002459425856611146,
"f1": 0.11012269295302003,
"f1_stderr": 0.002656818706713483,
"acc": 0.3355993065948289,
"acc_stderr": 0.008117942480603072
},
"harness|drop|3": {
"em": 0.06145134228187919,
"em_stderr": 0.002459425856611146,
"f1": 0.11012269295302003,
"f1_stderr": 0.002656818706713483
},
"harness|gsm8k|5": {
"acc": 0.011372251705837756,
"acc_stderr": 0.0029206661987887473
},
"harness|winogrande|5": {
"acc": 0.65982636148382,
"acc_stderr": 0.013315218762417397
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
FaalSa/dfaas | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 172899
num_examples: 3
- name: validation
num_bytes: 174339
num_examples: 3
- name: test
num_bytes: 175779
num_examples: 3
download_size: 89880
dataset_size: 523017
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Rusallan/Word2Vec | ---
license: unknown
---
|
open-llm-leaderboard/details_Undi95__Clover3-17B | ---
pretty_name: Evaluation run of Undi95/Clover3-17B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/Clover3-17B](https://huggingface.co/Undi95/Clover3-17B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Clover3-17B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-12T06:10:19.622221](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Clover3-17B/blob/main/results_2023-12-12T06-10-19.622221.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.600361059236723,\n\
\ \"acc_stderr\": 0.033074807649830854,\n \"acc_norm\": 0.608082187606879,\n\
\ \"acc_norm_stderr\": 0.03379934177123045,\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.01550620472283456,\n \"mc2\": 0.4072173688663445,\n\
\ \"mc2_stderr\": 0.014502556892504742\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5656996587030717,\n \"acc_stderr\": 0.01448470304885736,\n\
\ \"acc_norm\": 0.5989761092150171,\n \"acc_norm_stderr\": 0.014322255790719867\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6161123282214698,\n\
\ \"acc_stderr\": 0.004853371646239244,\n \"acc_norm\": 0.811790479984067,\n\
\ \"acc_norm_stderr\": 0.003900805416736722\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.02510742548113729,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02510742548113729\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n\
\ \"acc_stderr\": 0.024993053397764826,\n \"acc_norm\": 0.7387096774193549,\n\
\ \"acc_norm_stderr\": 0.024993053397764826\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.031156269519646836,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.031156269519646836\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153327,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153327\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.024321738484602354,\n\
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.01653061740926687,\n \"acc_norm\"\
: 0.818348623853211,\n \"acc_norm_stderr\": 0.01653061740926687\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n\
\ \"acc_stderr\": 0.034028015813589656,\n \"acc_norm\": 0.4675925925925926,\n\
\ \"acc_norm_stderr\": 0.034028015813589656\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501954,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501954\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460305,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460305\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381398,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381398\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.02599247202930639,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.02599247202930639\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.20446927374301677,\n\
\ \"acc_stderr\": 0.013488813404711919,\n \"acc_norm\": 0.20446927374301677,\n\
\ \"acc_norm_stderr\": 0.013488813404711919\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.026664410886937624,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.026664410886937624\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.02577311116963046,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.02577311116963046\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612506,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612506\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n\
\ \"acc_stderr\": 0.012633353557534425,\n \"acc_norm\": 0.42698826597131684,\n\
\ \"acc_norm_stderr\": 0.012633353557534425\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.02909720956841195,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.02909720956841195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6176470588235294,\n \"acc_stderr\": 0.01965992249362335,\n \
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.01965992249362335\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.01550620472283456,\n \"mc2\": 0.4072173688663445,\n\
\ \"mc2_stderr\": 0.014502556892504742\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090259\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18802122820318423,\n \
\ \"acc_stderr\": 0.010762621695354888\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/Clover3-17B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|arc:challenge|25_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|gsm8k|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hellaswag|10_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T06-10-19.622221.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-12T06-10-19.622221.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- '**/details_harness|winogrande|5_2023-12-12T06-10-19.622221.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-12T06-10-19.622221.parquet'
- config_name: results
data_files:
- split: 2023_12_12T06_10_19.622221
path:
- results_2023-12-12T06-10-19.622221.parquet
- split: latest
path:
- results_2023-12-12T06-10-19.622221.parquet
---
# Dataset Card for Evaluation run of Undi95/Clover3-17B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Undi95/Clover3-17B](https://huggingface.co/Undi95/Clover3-17B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__Clover3-17B",
"harness_winogrande_5",
	split="latest")
```
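
Each configuration also exposes one split per run, named after the run timestamp (e.g. `2023_12_12T06_10_19.622221`). As a sketch, note that this timestamp format sorts lexicographically, so if you ever need to pick the most recent run from the split names yourself (the split list below is illustrative, not from this dataset's API), a plain `max` suffices:

```python
# Illustrative split names in the timestamp format used by this dataset.
# The format YYYY_MM_DDTHH_MM_SS.ffffff sorts lexicographically in
# chronological order, so max() returns the most recent run.
splits = [
    "2023_11_01T09_00_00.000000",
    "2023_12_12T06_10_19.622221",
]
latest = max(splits)
print(latest)  # 2023_12_12T06_10_19.622221
```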
## Latest results
These are the [latest results from run 2023-12-12T06:10:19.622221](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Clover3-17B/blob/main/results_2023-12-12T06-10-19.622221.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.600361059236723,
"acc_stderr": 0.033074807649830854,
"acc_norm": 0.608082187606879,
"acc_norm_stderr": 0.03379934177123045,
"mc1": 0.26805385556915545,
"mc1_stderr": 0.01550620472283456,
"mc2": 0.4072173688663445,
"mc2_stderr": 0.014502556892504742
},
"harness|arc:challenge|25": {
"acc": 0.5656996587030717,
"acc_stderr": 0.01448470304885736,
"acc_norm": 0.5989761092150171,
"acc_norm_stderr": 0.014322255790719867
},
"harness|hellaswag|10": {
"acc": 0.6161123282214698,
"acc_stderr": 0.004853371646239244,
"acc_norm": 0.811790479984067,
"acc_norm_stderr": 0.003900805416736722
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02510742548113729,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02510742548113729
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764826,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764826
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.031156269519646836,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.031156269519646836
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153327,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153327
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.01653061740926687,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.01653061740926687
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.034028015813589656,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.034028015813589656
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501954,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501954
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460305,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460305
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381398,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381398
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.02599247202930639,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.02599247202930639
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.20446927374301677,
"acc_stderr": 0.013488813404711919,
"acc_norm": 0.20446927374301677,
"acc_norm_stderr": 0.013488813404711919
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937624,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937624
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.02577311116963046,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.02577311116963046
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612506,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612506
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42698826597131684,
"acc_stderr": 0.012633353557534425,
"acc_norm": 0.42698826597131684,
"acc_norm_stderr": 0.012633353557534425
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.02909720956841195,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.02909720956841195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.01965992249362335,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.01965992249362335
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26805385556915545,
"mc1_stderr": 0.01550620472283456,
"mc2": 0.4072173688663445,
"mc2_stderr": 0.014502556892504742
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090259
},
"harness|gsm8k|5": {
"acc": 0.18802122820318423,
"acc_stderr": 0.010762621695354888
}
}
```
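The per-task results above are plain JSON keyed by `harness|<task>|<n_shot>`. As an illustrative sketch (not part of the card itself, using a tiny hand-copied sample of the values shown above), such a blob can be parsed and the `acc` values macro-averaged over the MMLU (`hendrycksTest`) subtasks:

```python
import json

# A tiny sample in the same shape as the results blob above
# (values copied from the card; only three tasks kept for brevity).
results_json = """
{
  "harness|hendrycksTest-virology|5": {"acc": 0.5481927710843374},
  "harness|hendrycksTest-world_religions|5": {"acc": 0.8128654970760234},
  "harness|gsm8k|5": {"acc": 0.18802122820318423}
}
"""
results = json.loads(results_json)

# Macro-average "acc" over the hendrycksTest (MMLU) subtasks only,
# skipping non-MMLU tasks such as gsm8k.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))  # prints 0.6805
```

The same filtering-by-key-prefix pattern works for any of the `harness|…` task families in the full results file.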
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
yzhuang/metatree_pc3 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 343492
num_examples: 1087
- name: validation
num_bytes: 150416
num_examples: 476
download_size: 280604
dataset_size: 493908
---
# Dataset Card for "metatree_pc3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Inv__Elbrus-7B | ---
pretty_name: Evaluation run of Inv/Elbrus-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Inv/Elbrus-7B](https://huggingface.co/Inv/Elbrus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Inv__Elbrus-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T19:03:39.908723](https://huggingface.co/datasets/open-llm-leaderboard/details_Inv__Elbrus-7B/blob/main/results_2024-03-27T19-03-39.908723.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.629742693636363,\n\
\ \"acc_stderr\": 0.03242348568425567,\n \"acc_norm\": 0.634258914688389,\n\
\ \"acc_norm_stderr\": 0.033077083630482444,\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.44344840804142965,\n\
\ \"mc2_stderr\": 0.0145714765205392\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.606655290102389,\n \"acc_stderr\": 0.014275101465693026,\n\
\ \"acc_norm\": 0.6399317406143344,\n \"acc_norm_stderr\": 0.014027516814585188\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.643397729535949,\n\
\ \"acc_stderr\": 0.004780169873332852,\n \"acc_norm\": 0.839573790081657,\n\
\ \"acc_norm_stderr\": 0.003662508272330895\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.03800968060554859,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.03800968060554859\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7354838709677419,\n \"acc_stderr\": 0.02509189237885928,\n \"\
acc_norm\": 0.7354838709677419,\n \"acc_norm_stderr\": 0.02509189237885928\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5221674876847291,\n \"acc_stderr\": 0.035145285621750066,\n \"\
acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.035145285621750066\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.02503387058301518,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.02503387058301518\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396993,\n\
\ \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396993\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.016530617409266875,\n \"\
acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266875\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.034063153607115065,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.034063153607115065\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671631,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671631\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676173,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.02126271940040697,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.02126271940040697\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n\
\ \"acc_stderr\": 0.014179171373424384,\n \"acc_norm\": 0.8045977011494253,\n\
\ \"acc_norm_stderr\": 0.014179171373424384\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.0235329254310443,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.0235329254310443\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4022346368715084,\n\
\ \"acc_stderr\": 0.016399716732847135,\n \"acc_norm\": 0.4022346368715084,\n\
\ \"acc_norm_stderr\": 0.016399716732847135\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4589308996088657,\n\
\ \"acc_stderr\": 0.012727084826799798,\n \"acc_norm\": 0.4589308996088657,\n\
\ \"acc_norm_stderr\": 0.012727084826799798\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.02888819310398864,\n\
\ \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.02888819310398864\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.01911721391149515,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.01911721391149515\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.44344840804142965,\n\
\ \"mc2_stderr\": 0.0145714765205392\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7971586424625099,\n \"acc_stderr\": 0.011301439925936657\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42532221379833207,\n \
\ \"acc_stderr\": 0.01361800636308479\n }\n}\n```"
repo_url: https://huggingface.co/Inv/Elbrus-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|arc:challenge|25_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|gsm8k|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hellaswag|10_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T19-03-39.908723.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T19-03-39.908723.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- '**/details_harness|winogrande|5_2024-03-27T19-03-39.908723.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T19-03-39.908723.parquet'
- config_name: results
data_files:
- split: 2024_03_27T19_03_39.908723
path:
- results_2024-03-27T19-03-39.908723.parquet
- split: latest
path:
- results_2024-03-27T19-03-39.908723.parquet
---
# Dataset Card for Evaluation run of Inv/Elbrus-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Inv/Elbrus-7B](https://huggingface.co/Inv/Elbrus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Inv__Elbrus-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-27T19:03:39.908723](https://huggingface.co/datasets/open-llm-leaderboard/details_Inv__Elbrus-7B/blob/main/results_2024-03-27T19-03-39.908723.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task's results can be found in the "latest" split of its eval):
```python
{
"all": {
"acc": 0.629742693636363,
"acc_stderr": 0.03242348568425567,
"acc_norm": 0.634258914688389,
"acc_norm_stderr": 0.033077083630482444,
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.44344840804142965,
"mc2_stderr": 0.0145714765205392
},
"harness|arc:challenge|25": {
"acc": 0.606655290102389,
"acc_stderr": 0.014275101465693026,
"acc_norm": 0.6399317406143344,
"acc_norm_stderr": 0.014027516814585188
},
"harness|hellaswag|10": {
"acc": 0.643397729535949,
"acc_stderr": 0.004780169873332852,
"acc_norm": 0.839573790081657,
"acc_norm_stderr": 0.003662508272330895
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.03800968060554859,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.03800968060554859
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.035145285621750066,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.035145285621750066
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.02503387058301518,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.02503387058301518
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396993,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396993
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266875,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266875
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.034063153607115065,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.034063153607115065
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671631,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676173,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489122,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489122
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.02126271940040697,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.02126271940040697
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424384,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424384
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.0235329254310443,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.0235329254310443
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4022346368715084,
"acc_stderr": 0.016399716732847135,
"acc_norm": 0.4022346368715084,
"acc_norm_stderr": 0.016399716732847135
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4589308996088657,
"acc_stderr": 0.012727084826799798,
"acc_norm": 0.4589308996088657,
"acc_norm_stderr": 0.012727084826799798
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.02888819310398864,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.02888819310398864
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.01911721391149515,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.01911721391149515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835816,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835816
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764004,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764004
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.44344840804142965,
"mc2_stderr": 0.0145714765205392
},
"harness|winogrande|5": {
"acc": 0.7971586424625099,
"acc_stderr": 0.011301439925936657
},
"harness|gsm8k|5": {
"acc": 0.42532221379833207,
"acc_stderr": 0.01361800636308479
}
}
```
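The per-task scores above follow a uniform `harness|<task>|<n-shot>` → `{"acc": ..., "acc_stderr": ...}` layout, so they can be aggregated with a few lines of Python. A minimal sketch (the small `results` dict below is an illustrative excerpt of the JSON above, not the full file):

```python
from statistics import mean

# Illustrative excerpt of the per-task results above; the full JSON uses the
# same "harness|<task>|<n-shot>" -> {"acc": ...} structure.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5120481927710844},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8187134502923976},
    "harness|gsm8k|5": {"acc": 0.42532221379833207},
}

# Average accuracy over the MMLU (hendrycksTest) tasks only.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest")]
print(round(mean(mmlu_accs), 4))  # -> 0.6654
```

The same filter-and-average pattern applies to any subset of tasks in the full results file.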
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Saads/xecanto_birds | ---
license: mit
---
|
open-llm-leaderboard/details_jan-hq__LlamaCorn-1.1B | ---
pretty_name: Evaluation run of jan-hq/LlamaCorn-1.1B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jan-hq/LlamaCorn-1.1B](https://huggingface.co/jan-hq/LlamaCorn-1.1B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jan-hq__LlamaCorn-1.1B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-17T02:48:10.552865](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__LlamaCorn-1.1B/blob/main/results_2024-01-17T02-48-10.552865.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.29375199116706574,\n\
\ \"acc_stderr\": 0.03225608414226124,\n \"acc_norm\": 0.29607425614190314,\n\
\ \"acc_norm_stderr\": 0.03309063417483788,\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.0147891575310805,\n \"mc2\": 0.3677529114898043,\n\
\ \"mc2_stderr\": 0.013980681587593108\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3148464163822526,\n \"acc_stderr\": 0.01357265770308495,\n\
\ \"acc_norm\": 0.3412969283276451,\n \"acc_norm_stderr\": 0.013855831287497723\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44612626966739694,\n\
\ \"acc_stderr\": 0.004960732382255234,\n \"acc_norm\": 0.5933081059549891,\n\
\ \"acc_norm_stderr\": 0.004902125388002216\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.24342105263157895,\n \"acc_stderr\": 0.03492349668884239,\n\
\ \"acc_norm\": 0.24342105263157895,\n \"acc_norm_stderr\": 0.03492349668884239\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.28679245283018867,\n \"acc_stderr\": 0.027834912527544057,\n\
\ \"acc_norm\": 0.28679245283018867,\n \"acc_norm_stderr\": 0.027834912527544057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33191489361702126,\n \"acc_stderr\": 0.030783736757745657,\n\
\ \"acc_norm\": 0.33191489361702126,\n \"acc_norm_stderr\": 0.030783736757745657\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309994,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309994\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.023000086859068642,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.023000086859068642\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924316,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n\
\ \"acc_stderr\": 0.024892469172462826,\n \"acc_norm\": 0.25806451612903225,\n\
\ \"acc_norm_stderr\": 0.024892469172462826\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.21674876847290642,\n \"acc_stderr\": 0.028990331252516235,\n\
\ \"acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.028990331252516235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.34545454545454546,\n \"acc_stderr\": 0.037131580674819135,\n\
\ \"acc_norm\": 0.34545454545454546,\n \"acc_norm_stderr\": 0.037131580674819135\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.26262626262626265,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.26262626262626265,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2694300518134715,\n \"acc_stderr\": 0.03201867122877795,\n\
\ \"acc_norm\": 0.2694300518134715,\n \"acc_norm_stderr\": 0.03201867122877795\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2794871794871795,\n \"acc_stderr\": 0.022752388839776823,\n\
\ \"acc_norm\": 0.2794871794871795,\n \"acc_norm_stderr\": 0.022752388839776823\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.026335739404055803,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.026335739404055803\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.029344572500634342,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.029344572500634342\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24403669724770644,\n \"acc_stderr\": 0.018415286351416413,\n \"\
acc_norm\": 0.24403669724770644,\n \"acc_norm_stderr\": 0.018415286351416413\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3101851851851852,\n \"acc_stderr\": 0.03154696285656628,\n \"\
acc_norm\": 0.3101851851851852,\n \"acc_norm_stderr\": 0.03154696285656628\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28921568627450983,\n \"acc_stderr\": 0.031822318676475544,\n \"\
acc_norm\": 0.28921568627450983,\n \"acc_norm_stderr\": 0.031822318676475544\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3924050632911392,\n \"acc_stderr\": 0.03178471874564729,\n \
\ \"acc_norm\": 0.3924050632911392,\n \"acc_norm_stderr\": 0.03178471874564729\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4125560538116592,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.4125560538116592,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.32061068702290074,\n \"acc_stderr\": 0.040933292298342784,\n\
\ \"acc_norm\": 0.32061068702290074,\n \"acc_norm_stderr\": 0.040933292298342784\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.34710743801652894,\n \"acc_stderr\": 0.043457245702925355,\n \"\
acc_norm\": 0.34710743801652894,\n \"acc_norm_stderr\": 0.043457245702925355\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.35185185185185186,\n\
\ \"acc_stderr\": 0.04616631111801713,\n \"acc_norm\": 0.35185185185185186,\n\
\ \"acc_norm_stderr\": 0.04616631111801713\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.03351953879521269,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.03351953879521269\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.045218299028335865,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.045218299028335865\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260597,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260597\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3547008547008547,\n\
\ \"acc_stderr\": 0.03134250486245402,\n \"acc_norm\": 0.3547008547008547,\n\
\ \"acc_norm_stderr\": 0.03134250486245402\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.32567049808429116,\n\
\ \"acc_stderr\": 0.01675798945854968,\n \"acc_norm\": 0.32567049808429116,\n\
\ \"acc_norm_stderr\": 0.01675798945854968\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.315028901734104,\n \"acc_stderr\": 0.0250093137900697,\n\
\ \"acc_norm\": 0.315028901734104,\n \"acc_norm_stderr\": 0.0250093137900697\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26145251396648045,\n\
\ \"acc_stderr\": 0.014696599650364553,\n \"acc_norm\": 0.26145251396648045,\n\
\ \"acc_norm_stderr\": 0.014696599650364553\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.025553169991826507,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.025553169991826507\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2993827160493827,\n \"acc_stderr\": 0.025483115601195466,\n\
\ \"acc_norm\": 0.2993827160493827,\n \"acc_norm_stderr\": 0.025483115601195466\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902013,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902013\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.01099615663514269,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.01099615663514269\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21323529411764705,\n \"acc_stderr\": 0.02488097151229428,\n\
\ \"acc_norm\": 0.21323529411764705,\n \"acc_norm_stderr\": 0.02488097151229428\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25326797385620914,\n \"acc_stderr\": 0.01759348689536683,\n \
\ \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.01759348689536683\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22040816326530613,\n \"acc_stderr\": 0.026537045312145294,\n\
\ \"acc_norm\": 0.22040816326530613,\n \"acc_norm_stderr\": 0.026537045312145294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2885572139303483,\n\
\ \"acc_stderr\": 0.032038410402133226,\n \"acc_norm\": 0.2885572139303483,\n\
\ \"acc_norm_stderr\": 0.032038410402133226\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n\
\ \"acc_stderr\": 0.035509201856896294,\n \"acc_norm\": 0.29518072289156627,\n\
\ \"acc_norm_stderr\": 0.035509201856896294\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.036155076303109344,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.036155076303109344\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.0147891575310805,\n \"mc2\": 0.3677529114898043,\n\
\ \"mc2_stderr\": 0.013980681587593108\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6195737963693765,\n \"acc_stderr\": 0.013644727908656833\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.004548900682335102,\n \
\ \"acc_stderr\": 0.0018535550440036204\n }\n}\n```"
repo_url: https://huggingface.co/jan-hq/LlamaCorn-1.1B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|arc:challenge|25_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|gsm8k|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hellaswag|10_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T02-48-10.552865.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T02-48-10.552865.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- '**/details_harness|winogrande|5_2024-01-17T02-48-10.552865.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-17T02-48-10.552865.parquet'
- config_name: results
data_files:
- split: 2024_01_17T02_48_10.552865
path:
- results_2024-01-17T02-48-10.552865.parquet
- split: latest
path:
- results_2024-01-17T02-48-10.552865.parquet
---
# Dataset Card for Evaluation run of jan-hq/LlamaCorn-1.1B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jan-hq/LlamaCorn-1.1B](https://huggingface.co/jan-hq/LlamaCorn-1.1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jan-hq__LlamaCorn-1.1B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-17T02:48:10.552865](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__LlamaCorn-1.1B/blob/main/results_2024-01-17T02-48-10.552865.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.29375199116706574,
"acc_stderr": 0.03225608414226124,
"acc_norm": 0.29607425614190314,
"acc_norm_stderr": 0.03309063417483788,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.0147891575310805,
"mc2": 0.3677529114898043,
"mc2_stderr": 0.013980681587593108
},
"harness|arc:challenge|25": {
"acc": 0.3148464163822526,
"acc_stderr": 0.01357265770308495,
"acc_norm": 0.3412969283276451,
"acc_norm_stderr": 0.013855831287497723
},
"harness|hellaswag|10": {
"acc": 0.44612626966739694,
"acc_stderr": 0.004960732382255234,
"acc_norm": 0.5933081059549891,
"acc_norm_stderr": 0.004902125388002216
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.24342105263157895,
"acc_stderr": 0.03492349668884239,
"acc_norm": 0.24342105263157895,
"acc_norm_stderr": 0.03492349668884239
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.28679245283018867,
"acc_stderr": 0.027834912527544057,
"acc_norm": 0.28679245283018867,
"acc_norm_stderr": 0.027834912527544057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33191489361702126,
"acc_stderr": 0.030783736757745657,
"acc_norm": 0.33191489361702126,
"acc_norm_stderr": 0.030783736757745657
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309994,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309994
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.023000086859068642,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.023000086859068642
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924316,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.024892469172462826,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.024892469172462826
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.028990331252516235,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.028990331252516235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.34545454545454546,
"acc_stderr": 0.037131580674819135,
"acc_norm": 0.34545454545454546,
"acc_norm_stderr": 0.037131580674819135
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.26262626262626265,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.26262626262626265,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2694300518134715,
"acc_stderr": 0.03201867122877795,
"acc_norm": 0.2694300518134715,
"acc_norm_stderr": 0.03201867122877795
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2794871794871795,
"acc_stderr": 0.022752388839776823,
"acc_norm": 0.2794871794871795,
"acc_norm_stderr": 0.022752388839776823
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.026335739404055803,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.026335739404055803
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.029344572500634342,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.029344572500634342
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24403669724770644,
"acc_stderr": 0.018415286351416413,
"acc_norm": 0.24403669724770644,
"acc_norm_stderr": 0.018415286351416413
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3101851851851852,
"acc_stderr": 0.03154696285656628,
"acc_norm": 0.3101851851851852,
"acc_norm_stderr": 0.03154696285656628
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28921568627450983,
"acc_stderr": 0.031822318676475544,
"acc_norm": 0.28921568627450983,
"acc_norm_stderr": 0.031822318676475544
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3924050632911392,
"acc_stderr": 0.03178471874564729,
"acc_norm": 0.3924050632911392,
"acc_norm_stderr": 0.03178471874564729
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4125560538116592,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.4125560538116592,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.32061068702290074,
"acc_stderr": 0.040933292298342784,
"acc_norm": 0.32061068702290074,
"acc_norm_stderr": 0.040933292298342784
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.34710743801652894,
"acc_stderr": 0.043457245702925355,
"acc_norm": 0.34710743801652894,
"acc_norm_stderr": 0.043457245702925355
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.04616631111801713,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.04616631111801713
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.03351953879521269,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.03351953879521269
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.045218299028335865,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.045218299028335865
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260597,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260597
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3547008547008547,
"acc_stderr": 0.03134250486245402,
"acc_norm": 0.3547008547008547,
"acc_norm_stderr": 0.03134250486245402
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.32567049808429116,
"acc_stderr": 0.01675798945854968,
"acc_norm": 0.32567049808429116,
"acc_norm_stderr": 0.01675798945854968
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.315028901734104,
"acc_stderr": 0.0250093137900697,
"acc_norm": 0.315028901734104,
"acc_norm_stderr": 0.0250093137900697
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26145251396648045,
"acc_stderr": 0.014696599650364553,
"acc_norm": 0.26145251396648045,
"acc_norm_stderr": 0.014696599650364553
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.025553169991826507,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.025553169991826507
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2993827160493827,
"acc_stderr": 0.025483115601195466,
"acc_norm": 0.2993827160493827,
"acc_norm_stderr": 0.025483115601195466
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902013,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902013
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.01099615663514269,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.01099615663514269
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21323529411764705,
"acc_stderr": 0.02488097151229428,
"acc_norm": 0.21323529411764705,
"acc_norm_stderr": 0.02488097151229428
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.01759348689536683,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.01759348689536683
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22040816326530613,
"acc_stderr": 0.026537045312145294,
"acc_norm": 0.22040816326530613,
"acc_norm_stderr": 0.026537045312145294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2885572139303483,
"acc_stderr": 0.032038410402133226,
"acc_norm": 0.2885572139303483,
"acc_norm_stderr": 0.032038410402133226
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.035509201856896294,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.035509201856896294
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.036155076303109344,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.036155076303109344
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.0147891575310805,
"mc2": 0.3677529114898043,
"mc2_stderr": 0.013980681587593108
},
"harness|winogrande|5": {
"acc": 0.6195737963693765,
"acc_stderr": 0.013644727908656833
},
"harness|gsm8k|5": {
"acc": 0.004548900682335102,
"acc_stderr": 0.0018535550440036204
}
}
```
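The per-task metrics in a results file like the one above can be aggregated with plain Python. A minimal sketch, assuming a dict shaped like the JSON above (the helper name `mmlu_average` is our own, not part of the evaluation harness):

```python
from statistics import mean

def mmlu_average(results: dict) -> float:
    """Average the "acc" metric over all MMLU ("hendrycksTest") tasks
    in a results dict shaped like the JSON above."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-") and "acc" in metrics
    ]
    return mean(accs)

# Tiny worked example with two MMLU tasks and one non-MMLU task:
sample = {
    "harness|hendrycksTest-management|5": {"acc": 0.26, "acc_stderr": 0.04},
    "harness|hendrycksTest-marketing|5": {"acc": 0.35, "acc_stderr": 0.03},
    "harness|gsm8k|5": {"acc": 0.0045, "acc_stderr": 0.0018},
}
print(round(mmlu_average(sample), 3))  # averages only the hendrycksTest entries -> 0.305
```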
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
one-sec-cv12/chunk_87 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 25394322816.0
num_examples: 264392
download_size: 23210049393
dataset_size: 25394322816.0
---
# Dataset Card for "chunk_87"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Asad321/MKBHD-KsWAdpsB-scraped-data-Final-Evaluation-Demo | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1439
num_examples: 3
download_size: 4958
dataset_size: 1439
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "MKBHD-KsWAdpsB-scraped-data-Final-Evaluation-Demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vwxyzjn/openhermes-dev__mistralai_Mistral-7B-Instruct-v0.1__1707487539 | ---
dataset_info:
features:
- name: system_prompt
dtype: 'null'
- name: model
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: 'null'
- name: source
dtype: string
- name: title
dtype: 'null'
- name: topic
dtype: 'null'
- name: skip_prompt_formatting
dtype: bool
- name: idx
dtype: 'null'
- name: hash
dtype: 'null'
- name: views
dtype: 'null'
- name: custom_instruction
dtype: 'null'
- name: language
dtype: 'null'
- name: category
dtype: string
- name: id
dtype: 'null'
- name: model_name
dtype: 'null'
- name: prompt
dtype: string
- name: candidate0_policy
dtype: string
- name: candidate0
list:
- name: content
dtype: string
- name: role
dtype: string
- name: token_length
dtype: int64
- name: candidate1
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate1_policy
dtype: string
splits:
- name: train_prefs
num_bytes: 182567.71875
num_examples: 31
- name: test_prefs
num_bytes: 5889.28125
num_examples: 1
download_size: 191234
dataset_size: 188457.0
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
- split: test_prefs
path: data/test_prefs-*
---
|
open-llm-leaderboard/details_jan-hq__stealth-v2 | ---
pretty_name: Evaluation run of jan-hq/stealth-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jan-hq/stealth-v2](https://huggingface.co/jan-hq/stealth-v2) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jan-hq__stealth-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T18:57:50.959590](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__stealth-v2/blob/main/results_2024-02-29T18-57-50.959590.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6564370425907802,\n\
\ \"acc_stderr\": 0.03194911390660878,\n \"acc_norm\": 0.6550323302405521,\n\
\ \"acc_norm_stderr\": 0.032635665444863765,\n \"mc1\": 0.587515299877601,\n\
\ \"mc1_stderr\": 0.01723329939957121,\n \"mc2\": 0.7247164261385575,\n\
\ \"mc2_stderr\": 0.014632032305129024\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7226962457337884,\n \"acc_stderr\": 0.013082095839059376,\n\
\ \"acc_norm\": 0.7389078498293515,\n \"acc_norm_stderr\": 0.01283552390947384\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7277434773949413,\n\
\ \"acc_stderr\": 0.00444211526858093,\n \"acc_norm\": 0.8925512846046604,\n\
\ \"acc_norm_stderr\": 0.003090499801090434\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.0373852067611967,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.0373852067611967\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723292,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723292\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083004,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083004\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.02370309952525818,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.02370309952525818\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4670391061452514,\n\
\ \"acc_stderr\": 0.016686126653013934,\n \"acc_norm\": 0.4670391061452514,\n\
\ \"acc_norm_stderr\": 0.016686126653013934\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658537,\n\
\ \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658537\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\
\ \"acc_stderr\": 0.012749206007657476,\n \"acc_norm\": 0.47131681877444587,\n\
\ \"acc_norm_stderr\": 0.012749206007657476\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.587515299877601,\n\
\ \"mc1_stderr\": 0.01723329939957121,\n \"mc2\": 0.7247164261385575,\n\
\ \"mc2_stderr\": 0.014632032305129024\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8800315706393055,\n \"acc_stderr\": 0.009131996995678647\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6967399545109931,\n \
\ \"acc_stderr\": 0.012661502663418697\n }\n}\n```"
repo_url: https://huggingface.co/jan-hq/stealth-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|arc:challenge|25_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|gsm8k|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hellaswag|10_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-57-50.959590.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T18-57-50.959590.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- '**/details_harness|winogrande|5_2024-02-29T18-57-50.959590.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T18-57-50.959590.parquet'
- config_name: results
data_files:
- split: 2024_02_29T18_57_50.959590
path:
- results_2024-02-29T18-57-50.959590.parquet
- split: latest
path:
- results_2024-02-29T18-57-50.959590.parquet
---
# Dataset Card for Evaluation run of jan-hq/stealth-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jan-hq/stealth-v2](https://huggingface.co/jan-hq/stealth-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jan-hq__stealth-v2",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-29T18:57:50.959590](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__stealth-v2/blob/main/results_2024-02-29T18-57-50.959590.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6564370425907802,
"acc_stderr": 0.03194911390660878,
"acc_norm": 0.6550323302405521,
"acc_norm_stderr": 0.032635665444863765,
"mc1": 0.587515299877601,
"mc1_stderr": 0.01723329939957121,
"mc2": 0.7247164261385575,
"mc2_stderr": 0.014632032305129024
},
"harness|arc:challenge|25": {
"acc": 0.7226962457337884,
"acc_stderr": 0.013082095839059376,
"acc_norm": 0.7389078498293515,
"acc_norm_stderr": 0.01283552390947384
},
"harness|hellaswag|10": {
"acc": 0.7277434773949413,
"acc_stderr": 0.00444211526858093,
"acc_norm": 0.8925512846046604,
"acc_norm_stderr": 0.003090499801090434
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.0373852067611967,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.0373852067611967
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083004,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083004
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066307,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.02370309952525818,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.02370309952525818
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4670391061452514,
"acc_stderr": 0.016686126653013934,
"acc_norm": 0.4670391061452514,
"acc_norm_stderr": 0.016686126653013934
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658537,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658537
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657476,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657476
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.587515299877601,
"mc1_stderr": 0.01723329939957121,
"mc2": 0.7247164261385575,
"mc2_stderr": 0.014632032305129024
},
"harness|winogrande|5": {
"acc": 0.8800315706393055,
"acc_stderr": 0.009131996995678647
},
"harness|gsm8k|5": {
"acc": 0.6967399545109931,
"acc_stderr": 0.012661502663418697
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
thavens/ufb_mini | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
splits:
- name: train
num_bytes: 1065422
num_examples: 160
download_size: 604342
dataset_size: 1065422
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bjornbundgaard/colabwithtag | ---
license: unknown
---
|
SWHL/table_rec_test_dataset | ---
license: apache-2.0
task_categories:
- translation
language:
- zh
- en
tags:
- code
size_categories:
- n<1K
---
## Table Recognition Test Set
### Dataset Overview
- The dataset contains images of 18 tables, covering both photographed and screenshot images of bordered and borderless tables.
- It can be used together with the [TableRecognitionMetric](https://github.com/SWHL/TableRecognitionMetric) evaluation library to quickly benchmark various table reconstruction algorithms.
- **Contributions of additional data are very welcome! If you have any ideas, please open a discussion in the [issues](https://github.com/SWHL/TableRecognitionMetric/issues).**
### Supported Tasks
The dataset can be used for model validation and performance evaluation on custom data.
### Dataset Format and Structure
#### Data Format
The dataset contains only a test split and is intended solely for objective evaluation of algorithm performance.
```text
data
└── test
├── 000cce9ca593055d4618466e823e6d7c.jpg
├── 0aNtiNtRRLqEZ9y6PuShtAAAACMAAQED.jpg
├── 116d6b07ecfdae7721bd6bbf31031c1a.jpg
├── 18bc90cb646c109d22ba44565b9a58bc3095e6d3.jpg
├── 1e7d7fed671a9f9043edd57874ef1b13587afa8d.jpg
├── 20200211182342519549-0.jpg
├── 6a8f24150a396470ab29a5ff29aa959dfe7f1c57.jpg
├── Snipaste_2023-07-05_14-54-25.jpg
├── Snipaste_2023-07-05_14-54-58.jpg
├── Snipaste_2023-07-05_14-58-59.jpg
├── Snipaste_2023-07-05_15-00-55.jpg
├── metadata.jsonl
├── row_span.jpg
├── table2.jpg
├── table3.jpg
├── table4.jpg
├── table6.jpg
├── table7.jpg
└── table_recognition.jpg
```
#### Loading the Dataset
```python
from datasets import load_dataset
dataset = load_dataset("SWHL/table_rec_test_dataset")
test_data = dataset['test']
print(test_data)
```
### Dataset Creation
#### Source Data
The data was collected from the web; please contact us for removal in case of infringement.
#### Annotations
Annotations are provided in HTML format, for example:
```text
<html><body><table><tr><td colspan="2">Textln让机器像人类一样理解文字</td></tr><tr><td>Textln产品</td><td>产品描述</td></tr><tr><td>TextinServerAPI文字识别产品</td><td>通用文本识别、表格识别、卡证识别、票据识别、定制识别等识别产品</td></tr><tr><td>TextinMobileSDK图像处理与文字识别SDK</td><td>图像处理,文本、卡证、票据识别和信息提取移动端SDK</td></tr><tr><td>TextlnStudio文字识别训练平台</td><td>OCR自定义模版配置和机器学习训练平台</td></tr><tr><td>Textin企业A/管理平台</td><td>企业AI接入监控统计和渠道业务管理平台</td></tr><tr><td>Textin财报机器人</td><td>财务报表智能分类、识别、提取、匹配、试算产品</td></tr><tr><td>Textin合同比对机器人</td><td>合同多版本差异智能比对产品</td></tr><tr><td>Textin解决方案</td><td>结合客户业务场景和TextIn能力的场景解决方案</td></tr></table></body></html>
```
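Since each label is a flat HTML string like the one above, the cell contents can be recovered with the standard-library `html.parser` module. The sketch below is only an illustration of reading the annotation format, not part of the official evaluation pipeline (use TableRecognitionMetric for actual scoring):

```python
from html.parser import HTMLParser

class TableCellExtractor(HTMLParser):
    """Collect the text of each <td> cell, grouped row by row."""
    def __init__(self):
        super().__init__()
        self.rows = []            # list of rows; each row is a list of cell strings
        self._in_cell = False
        self._cell_text = []

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self.rows.append([])  # start a new row
        elif tag == "td":
            self._in_cell = True
            self._cell_text = []

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_cell = False
            self.rows[-1].append("".join(self._cell_text))

    def handle_data(self, data):
        if self._in_cell:
            self._cell_text.append(data)

# Toy annotation string in the same shape as the dataset labels
parser = TableCellExtractor()
parser.feed('<html><body><table><tr><td colspan="2">Header</td></tr>'
            '<tr><td>A</td><td>B</td></tr></table></body></html>')
print(parser.rows)  # [['Header'], ['A', 'B']]
```

Note that span attributes such as `colspan` are dropped by this sketch; a full reconstruction metric must account for them.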
|
espejelomar/my_embeddings | ---
license: mit
---
|
HydraLM/SkunkData-002 | ---
license: apache-2.0
---
|
shossain/govreport-qa-5-2048 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 133180
num_examples: 5
download_size: 45937
dataset_size: 133180
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "govreport-qa-5-2048"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jing24/seperate_all4 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 37000242
num_examples: 40959
download_size: 6763324
dataset_size: 37000242
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_all4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_stsb_who_as | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 5900
num_examples: 26
- name: test
num_bytes: 3468
num_examples: 13
- name: train
num_bytes: 15592
num_examples: 64
download_size: 28300
dataset_size: 24960
---
# Dataset Card for "MULTI_VALUE_stsb_who_as"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/CSIC_GPT2_Baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115621178.4375
num_examples: 37500
- name: test
num_bytes: 38540392.5
num_examples: 12500
download_size: 211849176
dataset_size: 154161570.9375
---
# Dataset Card for "CSIC_GPT2_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amaandhada/easter_egg_full | ---
license: apache-2.0
---
|
LambdaTests/VQAv2_sample_validation_benchmarks_partition_global_15_loca_7 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 14
num_examples: 1
download_size: 0
dataset_size: 14
---
# Dataset Card for "VQAv2_sample_validation_benchmarks_partition_global_15_loca_7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |