| datasetId | card |
|---|---|
jeremyvictor/gramatika1500k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 316739508
num_examples: 1199705
- name: dev
num_bytes: 39645742
num_examples: 150171
- name: test
num_bytes: 39708222
num_examples: 150124
download_size: 265500526
dataset_size: 396093472
---
# Dataset Card for "gramatika1500k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lukemann/baby-agi-dataset-v0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: instruction
dtype: string
- name: trajectory
list:
- name: image_id
dtype: string
- name: action_options
list:
- name: index
dtype: int32
- name: top_left
sequence: int32
- name: bottom_right
sequence: int32
- name: action_taken
struct:
- name: type
dtype: string
- name: value
dtype: string
- name: action_option_index
dtype: int32
splits:
- name: train
num_bytes: 722
num_examples: 1
download_size: 1432409
dataset_size: 722
---
# BabyAGI (Dataset)
The initial demonstration dataset follows the Hugging Face datasets spec, with the raw data split into two components: trajectory images and trajectory metadata. The metadata is stored in the raw dataset, and the images are stored on S3. The data is loaded using the dataloader defined in [baby_agi_dataset.py](./baby_agi_dataset.py).
**Data Layout:**
```plaintext
├── data
│ ├── metadata_0.json
│ ├── metadata_1.json
│ └── ...
└── baby_agi_dataset.py
```
### Metadata Format (.json)
```json
[
{
"id": "<trajectory_id_hash>",
"instruction": "<some instruction>",
"trajectory": [
{
"image_id": "image_id",
"action_options": [
{
"index": 0,
"top_left": [120, 340],
"bottom_right": [140, 440]
},
...
],
"action_taken": {
"type": "click",
"value": "value (only for type and scroll)",
"action_option_index": 0
}
},
...
]
}
]
```
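A metadata file that follows the schema above can be loaded with plain `json` and lightly validated; a minimal sketch (the helper name and the exact checks are illustrative, not part of the dataset loader):

```python
import json

# Fields every trajectory step should carry, per the schema above.
REQUIRED_STEP_KEYS = {"image_id", "action_options", "action_taken"}

def validate_metadata(trajectories: list) -> list:
    """Check each trajectory step carries the expected fields;
    returns the input unchanged so it can wrap `json.load` directly."""
    for traj in trajectories:
        for step in traj["trajectory"]:
            missing = REQUIRED_STEP_KEYS - step.keys()
            if missing:
                raise ValueError(f"step missing keys: {sorted(missing)}")
    return trajectories

# usage: trajectories = validate_metadata(json.load(open("data/metadata_0.json")))
```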
## Action Types
The dataset metadata includes three types of actions: "click", "type", and "scroll". The `action_option_index` field gives the index, within the `action_options` list, of the element the action was applied to.
1. **Click**: Represents a user clicking on an element.
2. **Type**: Represents a user typing into an input field.
3. **Scroll**: Represents a user scrolling the viewport. The `value` field indicates the direction of the scroll, with "up" corresponding to a 200px scroll upwards and "down" corresponding to a 200px scroll downwards. Note that `top_left` and `bottom_right` will always be zero-arrays for scroll actions.
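A minimal Python sketch of how these action records can be consumed; the dispatcher and its output format are illustrative, not part of the dataset loader:

```python
SCROLL_PX = 200  # both scroll directions move the viewport by 200px

def describe_action(action: dict, action_options: list) -> str:
    """Render one `action_taken` entry as a human-readable string."""
    kind = action["type"]
    if kind == "click":
        # action_option_index points into the step's action_options list
        box = action_options[action["action_option_index"]]
        return f"click box {box['top_left']} -> {box['bottom_right']}"
    if kind == "type":
        return f"type {action['value']!r}"
    if kind == "scroll":
        sign = -SCROLL_PX if action["value"] == "up" else SCROLL_PX
        return f"scroll {sign:+d}px"
    raise ValueError(f"unknown action type: {kind}")
```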
## Dataset Generation Pipeline
The dataset is generated through the following steps:
1. **Load Demo**: The demo is loaded from the Hugging Face dataset.
2. **Load Trace**: The trace is loaded from the Globus dataset.
3. **Process Trajectories**: For each Mind2Web (M2W) trajectory:
a) **Map Actions**: M2W actions are mapped to Playwright trace actions using the timestamp in `dom_content.json`.
b) **Screenshot DOM**: The DOM is screenshotted just before the action.
c) **Map Candidates**: `pos_candidates` and `neg_candidates` from the M2W action metadata are mapped to HTML bounding boxes via class+id matching from the action metadata. New bounding box coordinates are obtained for each.
d) **Craft Meta + Screenshot Pair**: The pair of metadata and screenshots is crafted and saved/appended.
4. **Save Data**: The updated data directory is saved to S3 and Hugging Face.
### Screenshots
Screenshots in this dataset are generated from the before states of Mind2Web trajectory traces. Each image is 2036 pixels wide and 1144 pixels tall. For alternate screen sizes (via augmentation), padding is added to preserve the aspect ratio, so the content of the screenshot remains consistent across different screen sizes.
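Under these assumptions (2036x1144 base size, uniform scaling), the padding needed for an alternate screen size can be sketched as:

```python
BASE_W, BASE_H = 2036, 1144  # native screenshot size

def letterbox_padding(target_w: int, target_h: int) -> tuple:
    """Total horizontal/vertical padding (in pixels) needed so the
    2036x1144 content fits the target size at its original aspect ratio."""
    scale = min(target_w / BASE_W, target_h / BASE_H)
    content_w = round(BASE_W * scale)
    content_h = round(BASE_H * scale)
    return target_w - content_w, target_h - content_h
```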
### Options Generation
Options in this dataset are generated from `pos_candidates` (always exactly one) and `neg_candidates` in the Mind2Web (M2W) dataset. The M2W dataset labels *all* possible interactions on the DOM, so only the 50 options with the largest area within the viewport containing the positive candidate are selected.
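The selection rule can be sketched as follows; the function and field names are illustrative, and the real pipeline works on the M2W candidate metadata:

```python
def select_options(pos_candidate: dict, neg_candidates: list, k: int = 50) -> list:
    """Keep the positive candidate plus the largest negatives by bounding-box
    area, up to `k` options total (a sketch of the rule described above)."""
    def area(box: dict) -> int:
        (x0, y0) = box["top_left"]
        (x1, y1) = box["bottom_right"]
        return (x1 - x0) * (y1 - y0)

    ranked = sorted(neg_candidates, key=area, reverse=True)
    return [pos_candidate] + ranked[: k - 1]
```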
### Scrolling
The Mind2Web (M2W) dataset captures the entire DOM, so when the selected option is not in the viewport, artificial scroll actions are created. This action has two possible values, "up" and "down", each corresponding to a 200px scroll in the respective direction.
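A sketch of how such artificial scroll actions could be generated; the viewport height, helper name, and coordinate convention are assumptions for illustration:

```python
VIEWPORT_H = 1144  # screenshot height, used as the viewport height
SCROLL_PX = 200    # each scroll action moves 200px

def scroll_actions_to(target_y: int, viewport_top: int = 0) -> list:
    """Artificial scroll actions needed before a target y-coordinate
    is inside the viewport (a sketch; the real pipeline works on
    DOM bounding boxes)."""
    actions = []
    while not (viewport_top <= target_y < viewport_top + VIEWPORT_H):
        direction = "down" if target_y >= viewport_top + VIEWPORT_H else "up"
        viewport_top += SCROLL_PX if direction == "down" else -SCROLL_PX
        actions.append({"type": "scroll", "value": direction,
                        "action_option_index": 0})
    return actions
```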
### Selecting
The "Select" action in the Mind2Web (M2W) dataset is recorded when a user makes a selection from a dropdown list. In this dataset, we represent it as a sequence of two distinct actions in a trajectory:
1. **Click**: The user clicks on the dropdown element.
2. **Type**: The user types the desired value followed by Enter.
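A sketch of this expansion; the `value` field for the click step and the use of `"\n"` to stand in for the trailing Enter keypress are illustrative, since the card does not specify them:

```python
def expand_select(action_option_index: int, value: str) -> list:
    """Represent an M2W 'Select' as the click + type pair described above."""
    return [
        # 1. click the dropdown element
        {"type": "click", "value": "",
         "action_option_index": action_option_index},
        # 2. type the desired value followed by Enter
        {"type": "type", "value": value + "\n",
         "action_option_index": action_option_index},
    ]
```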
## Usage
To use the dataset in your Python program, you can load it using the `load_dataset` function from the `datasets` library:
```python
from datasets import load_dataset
dataset = load_dataset("lukemann/baby-agi-dataset-v0")
first_row = dataset['train'][0]
print(first_row)
```
This will load the dataset and print the first row of the training set.
For a short demo, refer to the [demo.py](./demo.py) file. |
Brizape/tmvar_tokenized_split_0404_dev | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
- name: texts
dtype: string
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2897566
num_examples: 801
- name: validation
num_bytes: 723081
num_examples: 201
- name: test
num_bytes: 1746380
num_examples: 498
download_size: 1248054
dataset_size: 5367027
---
# Dataset Card for "tmvar_tokenized_split_0404_dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RikoteMaster/goemotion_4_llama2 | ---
dataset_info:
features:
- name: Text_processed
dtype: string
- name: Emotion
dtype: string
- name: text
dtype: string
- name: Augmented
dtype: bool
splits:
- name: train
num_bytes: 12984427
num_examples: 36324
download_size: 4427087
dataset_size: 12984427
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "goemotion_4_llama2_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HArman0007/bmv2 | ---
license: cc-by-4.0
---
|
wnut_17 | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
paperswithcode_id: wnut-2017-emerging-and-rare-entity
pretty_name: WNUT 17
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-corporation
'2': I-corporation
'3': B-creative-work
'4': I-creative-work
'5': B-group
'6': I-group
'7': B-location
'8': I-location
'9': B-person
'10': I-person
'11': B-product
'12': I-product
config_name: wnut_17
splits:
- name: train
num_bytes: 1078379
num_examples: 3394
- name: validation
num_bytes: 259383
num_examples: 1009
- name: test
num_bytes: 405536
num_examples: 1287
download_size: 800955
dataset_size: 1743298
---
# Dataset Card for "wnut_17"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [http://noisy-text.github.io/2017/emerging-rare-entities.html](http://noisy-text.github.io/2017/emerging-rare-entities.html)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 0.80 MB
- **Size of the generated dataset:** 1.74 MB
- **Total amount of disk used:** 2.55 MB
### Dataset Summary
WNUT 17: Emerging and Rare entity recognition
This shared task focuses on identifying unusual, previously-unseen entities in the context of emerging discussions.
Named entities form the basis of many modern approaches to other tasks (like event clustering and summarisation),
but recall on them is a real problem in noisy text - even among annotators. This drop tends to be due to novel entities and surface forms.
Take for example the tweet “so.. kktny in 30 mins?” - even human experts find the entity kktny hard to detect and resolve.
This task will evaluate the ability to detect and classify novel, emerging, singleton named entities in noisy text.
The goal of this task is to provide a definition of emerging and of rare entities, and based on that, also datasets for detecting these entities.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
- **Size of downloaded dataset files:** 0.80 MB
- **Size of the generated dataset:** 1.74 MB
- **Total amount of disk used:** 2.55 MB
An example of 'train' looks as follows.
```
{
"id": "0",
"ner_tags": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 7, 8, 8, 0, 7, 0, 0, 0, 0, 0, 0, 0, 0],
"tokens": ["@paulwalk", "It", "'s", "the", "view", "from", "where", "I", "'m", "living", "for", "two", "weeks", ".", "Empire", "State", "Building", "=", "ESB", ".", "Pretty", "bad", "storm", "here", "last", "evening", "."]
}
```
### Data Fields
The data fields are the same among all splits:
- `id` (`string`): ID of the example.
- `tokens` (`list` of `string`): Tokens of the example text.
- `ner_tags` (`list` of class labels): NER tags of the tokens (using IOB2 format), with possible values:
- 0: `O`
- 1: `B-corporation`
- 2: `I-corporation`
- 3: `B-creative-work`
- 4: `I-creative-work`
- 5: `B-group`
- 6: `I-group`
- 7: `B-location`
- 8: `I-location`
- 9: `B-person`
- 10: `I-person`
- 11: `B-product`
- 12: `I-product`
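The integer tags can be decoded back into label strings using the mapping above; a minimal sketch that needs no download (the `datasets` library exposes the same mapping via the `ner_tags` feature's `int2str` method):

```python
# IOB2 label names for wnut_17, in class-id order (from the list above).
WNUT_LABELS = [
    "O",
    "B-corporation", "I-corporation",
    "B-creative-work", "I-creative-work",
    "B-group", "I-group",
    "B-location", "I-location",
    "B-person", "I-person",
    "B-product", "I-product",
]

def decode_tags(ner_tags: list) -> list:
    """Convert a list of integer `ner_tags` into IOB2 label strings."""
    return [WNUT_LABELS[t] for t in ner_tags]
```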
### Data Splits
|train|validation|test|
|----:|---------:|---:|
| 3394| 1009|1287|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{derczynski-etal-2017-results,
title = "Results of the {WNUT}2017 Shared Task on Novel and Emerging Entity Recognition",
author = "Derczynski, Leon and
Nichols, Eric and
van Erp, Marieke and
Limsopatham, Nut",
booktitle = "Proceedings of the 3rd Workshop on Noisy User-generated Text",
month = sep,
year = "2017",
address = "Copenhagen, Denmark",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/W17-4418",
doi = "10.18653/v1/W17-4418",
pages = "140--147",
abstract = "This shared task focuses on identifying unusual, previously-unseen entities in the context of emerging discussions.
Named entities form the basis of many modern approaches to other tasks (like event clustering and summarization),
but recall on them is a real problem in noisy text - even among annotators.
This drop tends to be due to novel entities and surface forms.
Take for example the tweet {``}so.. kktny in 30 mins?!{''} {--} even human experts find the entity {`}kktny{'}
hard to detect and resolve. The goal of this task is to provide a definition of emerging and of rare entities,
and based on that, also datasets for detecting these entities. The task as described in this paper evaluated the
ability of participating entries to detect and classify novel and emerging named entities in noisy text.",
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@lhoestq](https://github.com/lhoestq), [@stefan-it](https://github.com/stefan-it), [@lewtun](https://github.com/lewtun), [@jplu](https://github.com/jplu) for adding this dataset. |
open-llm-leaderboard/details_ZoidBB__Jovian-10.7B-v1.0 | ---
pretty_name: Evaluation run of ZoidBB/Jovian-10.7B-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ZoidBB/Jovian-10.7B-v1.0](https://huggingface.co/ZoidBB/Jovian-10.7B-v1.0) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ZoidBB__Jovian-10.7B-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T22:02:39.167169](https://huggingface.co/datasets/open-llm-leaderboard/details_ZoidBB__Jovian-10.7B-v1.0/blob/main/results_2024-01-05T22-02-39.167169.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6580437344612732,\n\
\ \"acc_stderr\": 0.03198478309112164,\n \"acc_norm\": 0.6603650951482142,\n\
\ \"acc_norm_stderr\": 0.03262928239402026,\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965834,\n \"mc2\": 0.5200231220903274,\n\
\ \"mc2_stderr\": 0.015200059344934734\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6510238907849829,\n \"acc_stderr\": 0.013928933461382501,\n\
\ \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.013697432466693249\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6752638916550487,\n\
\ \"acc_stderr\": 0.00467319142386121,\n \"acc_norm\": 0.8639713204540929,\n\
\ \"acc_norm_stderr\": 0.0034211839093201612\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695248,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695248\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777025,\n \"\
acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777025\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n\
\ \"acc_stderr\": 0.022891687984554952,\n \"acc_norm\": 0.7967741935483871,\n\
\ \"acc_norm_stderr\": 0.022891687984554952\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.033851779760448106,\n \"\
acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.033851779760448106\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233483,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233483\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.01358661921990333,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.01358661921990333\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n\
\ \"acc_stderr\": 0.01659339422756484,\n \"acc_norm\": 0.43798882681564244,\n\
\ \"acc_norm_stderr\": 0.01659339422756484\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n\
\ \"acc_stderr\": 0.024723861504771693,\n \"acc_norm\": 0.7459807073954984,\n\
\ \"acc_norm_stderr\": 0.024723861504771693\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48565840938722293,\n\
\ \"acc_stderr\": 0.012764981829524277,\n \"acc_norm\": 0.48565840938722293,\n\
\ \"acc_norm_stderr\": 0.012764981829524277\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7132352941176471,\n \"acc_stderr\": 0.02747227447323381,\n\
\ \"acc_norm\": 0.7132352941176471,\n \"acc_norm_stderr\": 0.02747227447323381\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965834,\n \"mc2\": 0.5200231220903274,\n\
\ \"mc2_stderr\": 0.015200059344934734\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8184688239936859,\n \"acc_stderr\": 0.01083327651500749\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5724033358605004,\n \
\ \"acc_stderr\": 0.013627322286986815\n }\n}\n```"
repo_url: https://huggingface.co/ZoidBB/Jovian-10.7B-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|arc:challenge|25_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|arc:challenge|25_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|gsm8k|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|gsm8k|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hellaswag|10_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hellaswag|10_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T11-28-30.707086.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T22-02-39.167169.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T22-02-39.167169.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- '**/details_harness|winogrande|5_2024-01-05T11-28-30.707086.parquet'
- split: 2024_01_05T22_02_39.167169
path:
- '**/details_harness|winogrande|5_2024-01-05T22-02-39.167169.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T22-02-39.167169.parquet'
- config_name: results
data_files:
- split: 2024_01_05T11_28_30.707086
path:
- results_2024-01-05T11-28-30.707086.parquet
- split: 2024_01_05T22_02_39.167169
path:
- results_2024-01-05T22-02-39.167169.parquet
- split: latest
path:
- results_2024-01-05T22-02-39.167169.parquet
---
# Dataset Card for Evaluation run of ZoidBB/Jovian-10.7B-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ZoidBB/Jovian-10.7B-v1.0](https://huggingface.co/ZoidBB/Jovian-10.7B-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ZoidBB__Jovian-10.7B-v1.0",
"harness_winogrande_5",
split="train")
```
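As the config list above suggests, each timestamped split name is the run timestamp with `-` and `:` replaced by `_` (e.g. run `2024-01-05T22:02:39.167169` maps to split `2024_01_05T22_02_39.167169`). A small helper can derive the split name for a given run; note that `timestamp_to_split` is a hypothetical name for illustration, not part of the `datasets` API, and the naming convention is inferred from the split names listed in this card:

```python
def timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to its split name by replacing '-' and ':' with '_'.

    This mirrors the naming convention visible in this card's config list;
    it is an inferred convention, not an official API.
    """
    return timestamp.replace("-", "_").replace(":", "_")

# The timestamp of the latest run in this card:
print(timestamp_to_split("2024-01-05T22:02:39.167169"))
```

The resulting string can be passed as the `split` argument of `load_dataset` to select that specific run instead of `train`/`latest`.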
## Latest results
These are the [latest results from run 2024-01-05T22:02:39.167169](https://huggingface.co/datasets/open-llm-leaderboard/details_ZoidBB__Jovian-10.7B-v1.0/blob/main/results_2024-01-05T22-02-39.167169.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6580437344612732,
"acc_stderr": 0.03198478309112164,
"acc_norm": 0.6603650951482142,
"acc_norm_stderr": 0.03262928239402026,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965834,
"mc2": 0.5200231220903274,
"mc2_stderr": 0.015200059344934734
},
"harness|arc:challenge|25": {
"acc": 0.6510238907849829,
"acc_stderr": 0.013928933461382501,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.013697432466693249
},
"harness|hellaswag|10": {
"acc": 0.6752638916550487,
"acc_stderr": 0.00467319142386121,
"acc_norm": 0.8639713204540929,
"acc_norm_stderr": 0.0034211839093201612
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777025,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777025
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554952,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554952
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233483,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233483
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.01358661921990333,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.01358661921990333
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.01659339422756484,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.01659339422756484
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7459807073954984,
"acc_stderr": 0.024723861504771693,
"acc_norm": 0.7459807073954984,
"acc_norm_stderr": 0.024723861504771693
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48565840938722293,
"acc_stderr": 0.012764981829524277,
"acc_norm": 0.48565840938722293,
"acc_norm_stderr": 0.012764981829524277
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7132352941176471,
"acc_stderr": 0.02747227447323381,
"acc_norm": 0.7132352941176471,
"acc_norm_stderr": 0.02747227447323381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965834,
"mc2": 0.5200231220903274,
"mc2_stderr": 0.015200059344934734
},
"harness|winogrande|5": {
"acc": 0.8184688239936859,
"acc_stderr": 0.01083327651500749
},
"harness|gsm8k|5": {
"acc": 0.5724033358605004,
"acc_stderr": 0.013627322286986815
}
}
```
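The per-task blocks above can be aggregated by key prefix; for instance, an MMLU-style average over the `hendrycksTest` entries can be sketched as follows. This is a minimal illustration using a few values copied from the JSON above, not the leaderboard's exact aggregation code:

```python
# A few per-task entries copied verbatim from the results JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.41},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6907894736842105},
    "harness|winogrande|5": {"acc": 0.8184688239936859},  # not an MMLU task
}

# Select the MMLU (hendrycksTest) tasks by key prefix and average their acc.
mmlu_tasks = {
    k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")
}
mmlu_avg = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"MMLU average over {len(mmlu_tasks)} tasks: {mmlu_avg:.4f}")
```

The full leaderboard average is taken over all 57 `hendrycksTest` subtasks; the three tasks here are only a sample to keep the sketch short.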
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_abhishekchohan__mistral-7B-med-merge | ---
pretty_name: Evaluation run of abhishekchohan/mistral-7B-med-merge
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abhishekchohan/mistral-7B-med-merge](https://huggingface.co/abhishekchohan/mistral-7B-med-merge)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhishekchohan__mistral-7B-med-merge\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T23:55:53.444793](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishekchohan__mistral-7B-med-merge/blob/main/results_2024-01-21T23-55-53.444793.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5813577401888572,\n\
\ \"acc_stderr\": 0.03392117164078901,\n \"acc_norm\": 0.5837504497394033,\n\
\ \"acc_norm_stderr\": 0.03462209771980794,\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.017058761501347976,\n \"mc2\": 0.536535823977723,\n\
\ \"mc2_stderr\": 0.015589392373408393\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6143344709897611,\n \"acc_stderr\": 0.014224250973257187,\n\
\ \"acc_norm\": 0.6450511945392492,\n \"acc_norm_stderr\": 0.013983036904094089\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6461860187213703,\n\
\ \"acc_stderr\": 0.004771751187407019,\n \"acc_norm\": 0.8296156144194383,\n\
\ \"acc_norm_stderr\": 0.0037520176390837515\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.042857142857142816,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.042857142857142816\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6806451612903226,\n \"acc_stderr\": 0.026522709674667768,\n \"\
acc_norm\": 0.6806451612903226,\n \"acc_norm_stderr\": 0.026522709674667768\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162933,\n \"\
acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162933\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.03242497958178816,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.03242497958178816\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147601,\n\
\ \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.03051611137147601\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.025217315184846486,\n\
\ \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.025217315184846486\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.031811100324139245,\n\
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.031811100324139245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7467889908256881,\n \"acc_stderr\": 0.01864407304137503,\n \"\
acc_norm\": 0.7467889908256881,\n \"acc_norm_stderr\": 0.01864407304137503\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7107843137254902,\n \"acc_stderr\": 0.03182231867647553,\n \"\
acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.03182231867647553\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.02514093595033543,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.02514093595033543\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6756066411238825,\n\
\ \"acc_stderr\": 0.016740929047162702,\n \"acc_norm\": 0.6756066411238825,\n\
\ \"acc_norm_stderr\": 0.016740929047162702\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.02536116874968822,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.02536116874968822\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n\
\ \"acc_stderr\": 0.015748421208187303,\n \"acc_norm\": 0.3318435754189944,\n\
\ \"acc_norm_stderr\": 0.015748421208187303\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388866,\n\
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388866\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n\
\ \"acc_stderr\": 0.027731258647012,\n \"acc_norm\": 0.6077170418006431,\n\
\ \"acc_norm_stderr\": 0.027731258647012\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.02691500301138016,\n\
\ \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.02691500301138016\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4276401564537158,\n\
\ \"acc_stderr\": 0.012635799922765844,\n \"acc_norm\": 0.4276401564537158,\n\
\ \"acc_norm_stderr\": 0.012635799922765844\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5955882352941176,\n \"acc_stderr\": 0.02981263070156974,\n\
\ \"acc_norm\": 0.5955882352941176,\n \"acc_norm_stderr\": 0.02981263070156974\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324227,\n \
\ \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324227\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n\
\ \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n\
\ \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.034678266857038245,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.034678266857038245\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.017058761501347976,\n \"mc2\": 0.536535823977723,\n\
\ \"mc2_stderr\": 0.015589392373408393\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090254\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4495830174374526,\n \
\ \"acc_stderr\": 0.013702290047884738\n }\n}\n```"
repo_url: https://huggingface.co/abhishekchohan/mistral-7B-med-merge
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|arc:challenge|25_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|gsm8k|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hellaswag|10_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T23-55-53.444793.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T23-55-53.444793.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- '**/details_harness|winogrande|5_2024-01-21T23-55-53.444793.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T23-55-53.444793.parquet'
- config_name: results
data_files:
- split: 2024_01_21T23_55_53.444793
path:
- results_2024-01-21T23-55-53.444793.parquet
- split: latest
path:
- results_2024-01-21T23-55-53.444793.parquet
---
# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-med-merge
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhishekchohan/mistral-7B-med-merge](https://huggingface.co/abhishekchohan/mistral-7B-med-merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhishekchohan__mistral-7B-med-merge",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-21T23:55:53.444793](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishekchohan__mistral-7B-med-merge/blob/main/results_2024-01-21T23-55-53.444793.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5813577401888572,
"acc_stderr": 0.03392117164078901,
"acc_norm": 0.5837504497394033,
"acc_norm_stderr": 0.03462209771980794,
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347976,
"mc2": 0.536535823977723,
"mc2_stderr": 0.015589392373408393
},
"harness|arc:challenge|25": {
"acc": 0.6143344709897611,
"acc_stderr": 0.014224250973257187,
"acc_norm": 0.6450511945392492,
"acc_norm_stderr": 0.013983036904094089
},
"harness|hellaswag|10": {
"acc": 0.6461860187213703,
"acc_stderr": 0.004771751187407019,
"acc_norm": 0.8296156144194383,
"acc_norm_stderr": 0.0037520176390837515
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.042857142857142816,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.042857142857142816
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667768,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667768
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.03242497958178816,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.03242497958178816
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7668393782383419,
"acc_stderr": 0.03051611137147601,
"acc_norm": 0.7668393782383419,
"acc_norm_stderr": 0.03051611137147601
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5512820512820513,
"acc_stderr": 0.025217315184846486,
"acc_norm": 0.5512820512820513,
"acc_norm_stderr": 0.025217315184846486
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.031811100324139245,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.031811100324139245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7467889908256881,
"acc_stderr": 0.01864407304137503,
"acc_norm": 0.7467889908256881,
"acc_norm_stderr": 0.01864407304137503
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.03182231867647553,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.03182231867647553
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.02514093595033543,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.02514093595033543
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6756066411238825,
"acc_stderr": 0.016740929047162702,
"acc_norm": 0.6756066411238825,
"acc_norm_stderr": 0.016740929047162702
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.02536116874968822,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.02536116874968822
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187303,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187303
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.027121956071388866,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.027121956071388866
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6077170418006431,
"acc_stderr": 0.027731258647012,
"acc_norm": 0.6077170418006431,
"acc_norm_stderr": 0.027731258647012
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.02691500301138016,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.02691500301138016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4276401564537158,
"acc_stderr": 0.012635799922765844,
"acc_norm": 0.4276401564537158,
"acc_norm_stderr": 0.012635799922765844
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5955882352941176,
"acc_stderr": 0.02981263070156974,
"acc_norm": 0.5955882352941176,
"acc_norm_stderr": 0.02981263070156974
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324227,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7910447761194029,
"acc_stderr": 0.028748298931728655,
"acc_norm": 0.7910447761194029,
"acc_norm_stderr": 0.028748298931728655
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.034678266857038245,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.034678266857038245
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347976,
"mc2": 0.536535823977723,
"mc2_stderr": 0.015589392373408393
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090254
},
"harness|gsm8k|5": {
"acc": 0.4495830174374526,
"acc_stderr": 0.013702290047884738
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
dkshjn/mixqa_cot_2 | ---
dataset_info:
features:
- name: question
dtype: string
- name: options
dtype: string
- name: context
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 14631
num_examples: 19
download_size: 16262
dataset_size: 14631
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "mixqa_cot_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sowmya15/Profanity_22 | ---
license: apache-2.0
---
|
CyberHarem/am_ksg_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of am_ksg/AmKSG/KSG (Girls' Frontline)
This is the dataset of am_ksg/AmKSG/KSG (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are `bangs, short_hair, sunglasses, white_hair, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 15.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/am_ksg_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 7.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/am_ksg_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 25 | 15.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/am_ksg_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 13.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/am_ksg_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 25 | 23.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/am_ksg_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/am_ksg_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be minable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, jacket, solo, fingerless_gloves, hood_up, gun, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | jacket | solo | fingerless_gloves | hood_up | gun | looking_at_viewer |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------|:--------------------|:----------|:------|:--------------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X |
|
tyzhu/lmind_hotpot_train8000_eval7405_v1_doc_qa | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: train_ic_qa
path: data/train_ic_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: eval_ic_qa
path: data/eval_ic_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
splits:
- name: train_qa
num_bytes: 1380987
num_examples: 8000
- name: train_recite_qa
num_bytes: 8547861
num_examples: 8000
- name: train_ic_qa
num_bytes: 8539861
num_examples: 8000
- name: eval_qa
num_bytes: 1201450
num_examples: 7405
- name: eval_recite_qa
num_bytes: 7941487
num_examples: 7405
- name: eval_ic_qa
num_bytes: 7934082
num_examples: 7405
- name: all_docs
num_bytes: 12508009
num_examples: 26854
- name: all_docs_eval
num_bytes: 12506219
num_examples: 26854
- name: train
num_bytes: 13888996
num_examples: 34854
- name: validation
num_bytes: 1201450
num_examples: 7405
download_size: 0
dataset_size: 75650402
---
# Dataset Card for "lmind_hotpot_train8000_eval7405_v1_doc_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RaviNaik/RedPajamaTokenized | ---
license: mit
---
|
cdleong/temp_africaNLP_keyword_spotting_for_african_languages | ---
language:
- wo
- fuc
- srr
- mnk
- snk
---
## Dataset Description
- **Homepage:** https://zenodo.org/record/4661645
TEMPORARY TEST DATASET
Not for actual use! Attempting to test out a dataset script for loading https://zenodo.org/record/4661645
|
SEACrowd/titml_idn | ---
tags:
- speech-recognition
language:
- ind
---
# titml_idn
TITML-IDN (Tokyo Institute of Technology Multilingual - Indonesian) was collected to build a pioneering Indonesian Large Vocabulary Continuous Speech Recognition (LVCSR) system. Building an LVCSR system requires highly accurate acoustic models and large-scale language models. Since no Indonesian speech corpus was available yet, we collected speech data from 20 native Indonesian speakers (11 male and 9 female) to construct a corpus for training acoustic models based on Hidden Markov Models (HMMs). A text corpus collected by ILPS, Informatics Institute, University of Amsterdam, was used to build a 40K-vocabulary dictionary and an n-gram language model.
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@inproceedings{lestari2006titmlidn,
title={A large vocabulary continuous speech recognition system for Indonesian language},
author={Lestari, Dessi Puji and Iwano, Koji and Furui, Sadaoki},
booktitle={15th Indonesian Scientific Conference in Japan Proceedings},
pages={17--22},
year={2006}
}
```
## License
For research purposes only. If you use this corpus, you must cite (Lestari et al., 2006).
## Homepage
[http://research.nii.ac.jp/src/en/TITML-IDN.html](http://research.nii.ac.jp/src/en/TITML-IDN.html)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
Maiia/skillspan_job_ner_without_cap | ---
dataset_info:
features:
- name: input_ids
sequence: int64
- name: labels
sequence:
class_label:
names:
'0': B-Skill I-Knowledge
'1': I-Skill B-Knowledge
'2': B-Knowledge
'3': I-Skill I-Knowledge
'4': I-Skill
'5': B-Skill
'6': I-Knowledge
'7': O
'8': -100
splits:
- name: train
num_bytes: 2952664
num_examples: 8005
- name: test
num_bytes: 1001832
num_examples: 3565
download_size: 523562
dataset_size: 3954496
---
# Dataset Card for "skillspan_job_ner_without_cap"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ofua/artisanal_with_responses | ---
dataset_info:
features:
- name: dataset_id
dtype: string
- name: original_text
dtype: string
- name: rewritten_text
dtype: string
- name: prompt
dtype: string
- name: tune_response
dtype: string
splits:
- name: train
num_bytes: 17859686
num_examples: 11648
download_size: 11483886
dataset_size: 17859686
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ShoumikN/guanaco-llama2-500 | ---
license: mit
---
|
kaleemWaheed/twitter_dataset_1713163328 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 22758
num_examples: 53
download_size: 11461
dataset_size: 22758
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Hack90/ncbi_genbank_part_63 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
splits:
- name: train
num_bytes: 12920615864
num_examples: 13999730
download_size: 5041172591
dataset_size: 12920615864
---
# Dataset Card for "ncbi_genbank_part_63"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tollefj/all-nli-NOB | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: id
dtype: string
splits:
- name: train
num_bytes: 142671221
num_examples: 942854
download_size: 67856445
dataset_size: 142671221
task_categories:
- sentence-similarity
language:
- nb
- 'no'
license: cc-by-4.0
---
# Dataset Card for "all-nli-NOB"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nesboen/Style-Marc-Allante | ---
license: afl-3.0
---
|
uatafaque/rafa | ---
license: openrail
---
|
HealthTeam/350k_dataset_health_ar_en_th_tokenized | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 744995253
num_examples: 1075552
- name: validation
num_bytes: 26752860
num_examples: 40432
- name: test
num_bytes: 26752860
num_examples: 40432
download_size: 294054766
dataset_size: 798500973
---
# Dataset Card for "350k_dataset_health_ar_en_th_tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_239 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1488600372
num_examples: 292341
download_size: 1519686081
dataset_size: 1488600372
---
# Dataset Card for "chunk_239"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_14 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1068215036.0
num_examples: 209783
download_size: 1085666582
dataset_size: 1068215036.0
---
# Dataset Card for "chunk_14"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-59000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1016584
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zekeZZ/hh-rlhf-dpo | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 38071901
num_examples: 49391
- name: test
num_bytes: 989565
num_examples: 2466
download_size: 24318977
dataset_size: 39061466
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_simonveitner__MathHermes-2.5-Mistral-7B | ---
pretty_name: Evaluation run of simonveitner/MathHermes-2.5-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [simonveitner/MathHermes-2.5-Mistral-7B](https://huggingface.co/simonveitner/MathHermes-2.5-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_simonveitner__MathHermes-2.5-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T18:25:11.977949](https://huggingface.co/datasets/open-llm-leaderboard/details_simonveitner__MathHermes-2.5-Mistral-7B/blob/main/results_2023-12-04T18-25-11.977949.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6357052985064087,\n\
\ \"acc_stderr\": 0.03227227710982547,\n \"acc_norm\": 0.6396287253937496,\n\
\ \"acc_norm_stderr\": 0.032910368232277956,\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.519509607840464,\n\
\ \"mc2_stderr\": 0.015313445088017108\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6075085324232082,\n \"acc_stderr\": 0.01426963463567073,\n\
\ \"acc_norm\": 0.6476109215017065,\n \"acc_norm_stderr\": 0.013960142600598677\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.652459669388568,\n\
\ \"acc_stderr\": 0.004752158936871871,\n \"acc_norm\": 0.8418641704839673,\n\
\ \"acc_norm_stderr\": 0.0036412262941678012\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.0286372356398009,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.0286372356398009\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.032555253593403555,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.032555253593403555\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229862,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229862\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n\
\ \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.030500283176545847,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.030500283176545847\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993459,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993459\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2994413407821229,\n\
\ \"acc_stderr\": 0.01531825774597671,\n \"acc_norm\": 0.2994413407821229,\n\
\ \"acc_norm_stderr\": 0.01531825774597671\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687492,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687492\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.519509607840464,\n\
\ \"mc2_stderr\": 0.015313445088017108\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205191\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4927975739196361,\n \
\ \"acc_stderr\": 0.013771055751972868\n }\n}\n```"
repo_url: https://huggingface.co/simonveitner/MathHermes-2.5-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|arc:challenge|25_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|gsm8k|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hellaswag|10_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T18-25-11.977949.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T18-25-11.977949.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- '**/details_harness|winogrande|5_2023-12-04T18-25-11.977949.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T18-25-11.977949.parquet'
- config_name: results
data_files:
- split: 2023_12_04T18_25_11.977949
path:
- results_2023-12-04T18-25-11.977949.parquet
- split: latest
path:
- results_2023-12-04T18-25-11.977949.parquet
---
# Dataset Card for Evaluation run of simonveitner/MathHermes-2.5-Mistral-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/simonveitner/MathHermes-2.5-Mistral-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [simonveitner/MathHermes-2.5-Mistral-7B](https://huggingface.co/simonveitner/MathHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
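Judging from the configs above, the per-run split name appears to be the run timestamp with `-` and `:` replaced by `_` (an observed convention in this generated card, not a documented guarantee):

```python
# Derive the per-run split name from an ISO-like run timestamp.
# This mirrors the naming visible in the configs above; it is an
# observed convention, not a documented API.
def run_split_name(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

print(run_split_name("2023-12-04T18:25:11.977949"))
# -> 2023_12_04T18_25_11.977949
```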
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_simonveitner__MathHermes-2.5-Mistral-7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-04T18:25:11.977949](https://huggingface.co/datasets/open-llm-leaderboard/details_simonveitner__MathHermes-2.5-Mistral-7B/blob/main/results_2023-12-04T18-25-11.977949.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6357052985064087,
"acc_stderr": 0.03227227710982547,
"acc_norm": 0.6396287253937496,
"acc_norm_stderr": 0.032910368232277956,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.519509607840464,
"mc2_stderr": 0.015313445088017108
},
"harness|arc:challenge|25": {
"acc": 0.6075085324232082,
"acc_stderr": 0.01426963463567073,
"acc_norm": 0.6476109215017065,
"acc_norm_stderr": 0.013960142600598677
},
"harness|hellaswag|10": {
"acc": 0.652459669388568,
"acc_stderr": 0.004752158936871871,
"acc_norm": 0.8418641704839673,
"acc_norm_stderr": 0.0036412262941678012
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.0286372356398009,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.0286372356398009
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229862,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229862
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545847,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545847
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993459,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993459
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2994413407821229,
"acc_stderr": 0.01531825774597671,
"acc_norm": 0.2994413407821229,
"acc_norm_stderr": 0.01531825774597671
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687492,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687492
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.519509607840464,
"mc2_stderr": 0.015313445088017108
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.011705697565205191
},
"harness|gsm8k|5": {
"acc": 0.4927975739196361,
"acc_stderr": 0.013771055751972868
}
}
```
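The `all` block above is an aggregate over the per-task metrics. As a quick, self-contained sanity check (using a small invented subset of the scores above, not the full result set — the leaderboard's own aggregation may weight or group tasks differently), a simple mean can be reproduced like this:

```python
# Recompute a plain mean accuracy from a few per-task scores
# (subset of the results above, for illustration only).
scores = {
    "harness|hendrycksTest-abstract_algebra|5": 0.33,
    "harness|hendrycksTest-anatomy|5": 0.5777777777777777,
    "harness|hendrycksTest-astronomy|5": 0.6973684210526315,
}

mean_acc = sum(scores.values()) / len(scores)
print(round(mean_acc, 4))
```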
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mayflowergmbh/dolphin_de | ---
task_categories:
- text-generation
language:
- de
---
A German translation of the [cognitivecomputations/dolphin](https://huggingface.co/datasets/cognitivecomputations/dolphin) dataset.
Extracted from [seedboxventures/multitask_german_examples_32k](https://huggingface.co/datasets/seedboxventures/multitask_german_examples_32k).
Translation created by [seedbox ai](https://huggingface.co/seedboxai) for [KafkaLM](https://huggingface.co/seedboxai/KafkaLM-70B-German-V0.1) ❤️.
Available for finetuning in [hiyouga/LLaMA-Factory](https://github.com/hiyouga/LLaMA-Factory). |
Loewolf/L-GPT_dataset | ---
license: mit
---
|
SWLLMS/sum_dataset_TK0_480 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 44557837.6
num_examples: 384
- name: test
num_bytes: 11139459.4
num_examples: 96
download_size: 12972747
dataset_size: 55697297.0
---
# Dataset Card for "sum_dataset_TK0_480"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alobaidizt/ORBC_embeddings | ---
license: mit
---
|
Flavinhouaua2022/jorge | ---
license: openrail
---
|
biadrivex/gojo | ---
license: openrail
---
|
Wisesofi/Storage.google | ---
license: llama2
---
```python
import transformers
import datasets
import streamlit as st

# Load the DNA structures dataset (dataset name and archive path as given in this card)
dataset = datasets.load_dataset("protein_structure", data_dir="dataset_archive(1).zip")

# Load the model and its tokenizer ("alphafold" is kept from the original card as a placeholder model id)
tokenizer = transformers.AutoTokenizer.from_pretrained("alphafold")
model = transformers.AutoModelForSequenceClassification.from_pretrained("alphafold")

# Create a function to predict the structure of a protein:
# the raw sequence must be tokenized before it is passed to the model
def predict_structure(sequence):
    inputs = tokenizer(sequence, return_tensors="pt")
    return model(**inputs).logits

# Create a Streamlit app
st.title("DNA Structures Research")

# Input the DNA sequence
sequence = st.text_input("DNA Sequence")

# Predict and display the structure only once a sequence has been entered
if sequence:
    st.write(predict_structure(sequence))
```
|
H4438/tri-edu-date | ---
dataset_info:
features:
- name: title
dtype: string
- name: url
dtype: string
- name: dates
sequence: string
- name: body
dtype: string
- name: head
dtype: string
- name: est_date
dtype: string
- name: ext_dates
sequence: string
- name: flt_dates
sequence: string
splits:
- name: train
num_bytes: 214580613
num_examples: 37239
download_size: 0
dataset_size: 214580613
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "tri-edu-date"
Left: 3429 rows - 0.09%
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mpingale/amsProject | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 53628
num_examples: 157
download_size: 32020
dataset_size: 53628
---
# Dataset Card for "amsProject"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wonsangkim/dreambooth-hackathon-images-jindo | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 3620773.0
num_examples: 20
download_size: 3618987
dataset_size: 3620773.0
---
# Dataset Card for "dreambooth-hackathon-images-jindo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distantquant/worded-math | ---
license: cc-by-4.0
language:
- en
pretty_name: Worded Math
size_categories:
- 100K<n<1M
---
## Worded Math
- Version 1.1
- Updated for general improvements
One million examples of word-based math (in English) with numeric results.
Created from [fblgit/simple-math](https://huggingface.co/datasets/fblgit/simple-math).
Using this Python code (requires `inflect`):
```py
import random
import inflect
import math
inf = inflect.engine()
# Define the number of samples you want to generate
num_samples = 1000000
# Define the range for the random numbers
min_value = -999.9
max_value = 999.9
# Define the arithmetic operations
operations = ["+", "-", "*", "/"]
div = ["divided by", "divided into"]
plus = ["plus", "added to"]
minus = ["minus", "subtracted from"]
times = ["times", "multiplied by"]
splitted = num_samples/5
# Generate data
train_data = []
for i in range(num_samples):
# Limit max num1,num2 to -+9,999,999
if i > 100000:
ti = 100000
else:
ti = i
multfactor = math.trunc((ti/10)+1)
# Randomly sometimes float randomly don't
if i % 2 == 0 and i >= splitted:
num1 = float("%.3f" % random.uniform(min_value*multfactor, max_value*multfactor))
num2 = float("%.3f" % random.uniform(min_value*multfactor, max_value*multfactor))
else:
num1 = math.trunc(random.uniform(min_value*multfactor, max_value*multfactor))
num2 = math.trunc(random.uniform(min_value*multfactor, max_value*multfactor))
while num2 == 0.0:
num2 = float("%.3f" % random.uniform(min_value, max_value))
while num1 == 0.0:
num1 = float("%.3f" % random.uniform(min_value, max_value))
operation = random.choice(operations)
if operation == "/":
result = num1 / num2
opp = random.choice(div)
elif operation == '-':
result = num1 - num2
opp = random.choice(minus)
elif operation == '*':
result = num1 * num2
opp = random.choice(times)
elif operation == '+':
result = num1 + num2
opp = random.choice(plus)
output = round(result, 4)
num1 = inf.number_to_words(num1)
num2 = inf.number_to_words(num2)
if random.randint(0, 1) == 1:
output = inf.number_to_words(output)
else:
output = str(output)
instruction = f"{num1} {opp} {num2}"
train_data.append({'instruction': instruction, 'output': output})
# Create the dataset
import json
# Build the test split
test_data = []
to_pop = []
for re in range(num_samples):
if re % 40 == 0:
if (re/40) % 2 == 0:
test_data.append(train_data[re])
to_pop.append(re)
else:
test_data.append(train_data[re-1])
to_pop.append(re-1)
# Pop test data from train data
popi = 0
for pop in to_pop:
train_data.pop(pop-popi)
popi += 1
# Output test data
test_out_file = 'worded-math-test-v1.1.json'
with open(test_out_file, 'w') as f:
json.dump(test_data, f)
# Output train data
train_out_file = 'worded-math-train-v1.1.json'
with open(train_out_file, 'w') as f:
json.dump(train_data, f)
``` |
arafatar/details_harness_drop | ---
license: unknown
---
|
AdapterOcean/code_instructions_standardized_cluster_10 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 54794053
num_examples: 5757
download_size: 15178326
dataset_size: 54794053
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
keremberke/license-plate-object-detection | ---
task_categories:
- object-detection
tags:
- roboflow
- roboflow2huggingface
- Self Driving
- Anpr
---
<div align="center">
<img width="640" alt="keremberke/license-plate-object-detection" src="https://huggingface.co/datasets/keremberke/license-plate-object-detection/resolve/main/thumbnail.jpg">
</div>
### Dataset Labels
```
['license_plate']
```
### Number of Images
```json
{'train': 6176, 'valid': 1765, 'test': 882}
```
### How to Use
- Install [datasets](https://pypi.org/project/datasets/):
```bash
pip install datasets
```
- Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("keremberke/license-plate-object-detection", name="full")
example = ds['train'][0]
```
### Roboflow Dataset Page
[https://universe.roboflow.com/augmented-startups/vehicle-registration-plates-trudk/dataset/1](https://universe.roboflow.com/augmented-startups/vehicle-registration-plates-trudk/dataset/1?ref=roboflow2huggingface)
### Citation
```
@misc{ vehicle-registration-plates-trudk_dataset,
title = { Vehicle Registration Plates Dataset },
type = { Open Source Dataset },
author = { Augmented Startups },
howpublished = { \\url{ https://universe.roboflow.com/augmented-startups/vehicle-registration-plates-trudk } },
url = { https://universe.roboflow.com/augmented-startups/vehicle-registration-plates-trudk },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { jun },
note = { visited on 2023-01-18 },
}
```
### License
CC BY 4.0
### Dataset Summary
This dataset was exported via roboflow.ai on January 13, 2022 at 5:20 PM GMT
It includes 8823 images.
VRP are annotated in COCO format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
No image augmentation techniques were applied.
|
laion/laion2B-en-md5 | |
llm-aes/dataset_hanna_96_prompts_llm_eval | ---
dataset_info:
features:
- name: task_id
dtype: string
- name: worker_id
dtype: string
- name: human_label
dtype: int64
- name: llm_label
dtype: int64
- name: generator_1
dtype: string
- name: generator_2
dtype: string
- name: premise
dtype: string
splits:
- name: train
num_bytes: 1101138
num_examples: 5280
download_size: 109316
dataset_size: 1101138
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HCSA/Forest_Plot_Data_2023 | ---
license: cc-by-sa-4.0
---
# Dataset Card for HCSA Forest Plot Data 2023
• Title: High Carbon Stock Approach (HCSA) Forest Plot Data 2023
• Description: This dataset contains forest field plot inventory data collected using the High Carbon Stock Approach (HCSA) methodology. The data was collected to provide validation and training data for large-scale indicative HCS forest maps produced with the HCSA Large-scale Mapping Framework (https://highcarbonstock.org/wp-content/uploads/2023/02/HCSA-Large-Scale-MAP-FWK-Procedure-1.pdf) under a project funded by the GIZ Fair Forward Initiative.
• The data includes various parameters related to land cover, carbon content, tree characteristics, and biomass calculations.

## Dataset Details
• Region: The province in Indonesia where the data was collected. <br>
• X Coordinate: The horizontal position of the plot – WGS 84 <br>
• Y Coordinate: The vertical position of the plot – WGS 84 <br>
• Land Cover Update: Updated HCS forest class derived from plot carbon stock <br>
• Land Cover Indicatives: Indicative HCS forest class from Lang et al., 2021 https://arxiv.org/abs/2107.07431<br>
• Carbon (ton/ha): Carbon content per hectare in the forest field plot. <br>
• Plot Area (Ha): The total area covered by the forest field plot in hectares (See method figure) <br>
• Tree Number: ID of the individual trees in the plot <br>
• DBH (cm): Diameter at Breast Height, measured in centimeters. <br>
• Height (cm): The height of the trees measured in centimeters. <br>
• Height (m): The height of the trees converted to meters. <br>
• Local Name: Common name or local name of the tree species. <br>
• Scientific Name: The scientific name of the tree species. <br>
• Family: Taxonomic family to which the tree species belongs. <br>
• Wood Density (g/cm3): The density of wood in grams per cubic centimeter (Ketterings et al., 2001, https://apps.worldagroforestry.org/sea/Products/AFDbases/WD/Index.htm) <br>
• Biomass (kg)/pohon: Biomass per tree measured in kilograms (Chave et al., 2014, https://doi.org/10.1111/gcb.12629). <br>
• Biomass (ton)/pohon: Biomass per tree converted to metric tons. <br>
• Biomass (ton/ha): Total biomass per hectare calculated from the field plot.<br>
### Dataset Description
• High Carbon Stock Approach (HCSA): The dataset contains forest field plot data collected following the HCSA methodology, a widely recognized approach for assessing and managing forest carbon stocks. A more detailed description of HCSA forest inventory methods can be found in the HCSA Toolkit Module 4 (https://highcarbonstock.org/wp-content/uploads/2017/09/HCSA-Toolkit-v2.0-Module-4-Forest-and-vegetation-stratification-190917-web.pdf).
- **Curated by:** High Carbon Stock Approach
- **Funded by :** Deutsche Gesellschaft für Internationale Zusammenarbeit (GIZ)
- **Language(s) (NLP):** English, Bahasa Indonesia
- **License:** cc-by-sa-4.0
### Dataset Sources
- **Repository:** High Carbon Stock Approach - ArcGIS Online -
## Uses
• Purpose: The dataset is intended for research and analysis related to forest ecology, carbon sequestration, and biodiversity.
• Citation: If used in publications or research, please cite the dataset as follows: High Carbon Stock Approach (2023). Forest Field Plot Data for Indicative HCS Mapping, Indonesia.
### Direct Use
This data is intended to support identification of indicative HCS forests for HCSA Landscape and Jurisdictional Approach implementation, for Smallholder Approaches, or as a preliminary step in an HCS assessment process.
### Out-of-Scope Use
This dataset is not to be used for the sale of carbon credits on community lands without proper FPIC, consent, and mapping exercises.
Indicative HCS forest maps are probability maps and do not represent HCS Assessments or final maps from implementation of the HCS Landscape and Jurisdictional Approach methodologies.
## Dataset Structure
This dataset contains a .csv file of field plot data
### Curation Rationale
The core mission of the High Carbon Stock Approach (HCSA) is to halt deforestation resulting from commodity production. It is a tool that distinguishes natural forests from degraded lands in tropical landscapes so they can be conserved, amplifying the role of forest conservation as a nature-based solution to climate change, while at the same time supporting biodiversity conservation, community rights and benefits, and responsible development. To understand the distribution of High Carbon Stock (HCS) forests throughout the landscape and work to protect them in collaboration with smallholder farmers, communities, and local and national governments, it is necessary to accurately map these resources at a regional and national scale. Field plot data is an essential step in landscape and jurisdictional implementation of HCS for commodities including palm oil, rubber, and cocoa. Developing large-scale (country- to continent-wide) indicative HCS forest maps is valuable for several reasons:
<br>First, the identification and classification of HCS forests plays a major role in prioritizing and guiding conservation efforts. Not all forests have the same carbon stock potential. Carbon stock-based forest classification directs attention to forests with high carbon levels, which can contribute to climate change mitigation when conserved.
<br>Second, data collection serves as a valuable source of knowledge to inform decision-making. This information supports sustainable land-use management and spatial planning prior to land status confirmation, environmental policy formulation, and the promotion of green economy strategies. The data provides details of carbon stocks, biological diversity, and socio-cultural conditions in the area, and facilitates data-driven policies for digital transformation.
<br>Third, the focus on HCS forests synergizes with Indonesia's commitment to reduce the emission rate of greenhouse gases, as in strategic plans such as the RPJMN and PRJPP. This commitment is in line with the common goal of the Paris Agreement.
#### Data Collection and Processing
• High Carbon Stock Approach (HCSA): The dataset is collected following the HCSA methodology, a widely recognized approach for assessing and managing forest carbon stocks. A more detailed description of HCSA forest inventory methods can be found in the HCSA Toolkit Module 4 (https://highcarbonstock.org/wp-content/uploads/2017/09/HCSA-Toolkit-v2.0-Module-4-Forest-and-vegetation-stratification-190917-web.pdf).
• Data Collection Methods: Data was collected in partnership with JKPP (https://jkpp.org/), an Indonesian civil society organization focusing on community land rights and mapping. Field plot locations were identified in advance using GIS software. Transect start points are normally located at convenient positions along roads, rivers, canals or other access routes. The distance between plots is generally dictated by the scale of the study area. Where large forest areas are being sampled and inventory planners seek broader coverage, this distance will be increased. The distance between plots is usually either 75 m or 100 m, but there is no fixed rule (Figure 2).
The same kind of plot is used for random, systematic and transect sampling. The recommended sample plot design is two concentric circles from a centre point with a total area of 500 m² or 0.05 ha. Circular plots are preferred to rectangular plots because they minimize the potential for error caused by slope factors and physical obstacles that may skew plot boundary lines.
The focus of vegetation measurement is on large plant species, which usually comprise the large majority of above-ground biomass (AGB). Other forest carbon pools are not measured because they are either relatively small in size (e.g. forest understory) and do not store much carbon, or are difficult and expensive to assess (e.g. below-ground biomass, deadwood and soil organic matter). Large plant species are defined as those having a diameter at breast height (DBH) greater than or equal to 5 cm. This includes both tree and non-tree species. Breast height for the DBH measurement is defined as 1.3 metres.
Large plant species (referred to as ‘trees’ for simplicity, but also including non-tree species such as some palms) are measured using the following steps:
<br>1. Identification of ‘in’ trees: A tree is defined as an ‘in’ tree if the centre of its stem at DBH is within the boundaries of the plot. Trees on the edge of the plot (borderline trees) must be checked using a nylon rope marked at the correct plot radii (see Figure 12).
<br>2. Flagging tape: Each tree is labelled with flagging tape. The label must indicate the tree number as recorded in the field book.
<br>3. DBH measurement: All trees greater than or equal to 15 cm DBH (forest inventory plot) shall be measured in the large plot. In addition to the large trees, all trees greater than or equal to 5 cm and less than 15 cm DBH (forest inventory plot) shall be measured in the small plot (see Figure 13).
<br>4. Height measurement: Depending on the eventual allometric equation used, it may also be necessary to measure total tree heights. Tree heights were measured with the Nikon Forestry Pro II Laser Rangefinder/Hypsometer. These calculate height automatically based on readings taken to the top and bottom of the tree, plus, in some cases, a reading of horizontal distance. Once the user is familiar with their mode of operation, these meters are practical to use and measurements can be carried out by one person (usually the team leader). Height measurement with clinometers is also possible but tends to be slow and more prone to error. Where allometrics require an estimate of total tree height, there are two options for generation of height data: measuring a subset of trees and then deriving a diameter-tree height regression from the measured trees, or direct measurement of all trees.
<br>5. Species: All trees measured in the plot must be identified to genus level and preferably to species level. This information is needed in the allometric equation, and to be able to describe forest composition and structure in a general way. As stated previously, botanists should be part of the field team; local names can be noted in the field book and translated to species names later on. If a genus cannot be identified, photographs and botanical samples must be collected and marked so that experts can identify them later.
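The DBH thresholds in steps 1-3 can be expressed as a simple plot-inclusion rule. A minimal sketch (the function name is ours; the thresholds are taken from the text above):

```python
def measurement_plot(dbh_cm):
    """Return which concentric sample plot a stem of the given DBH (cm) is measured in."""
    if dbh_cm >= 15:
        return "large"   # large plot: DBH >= 15 cm
    if dbh_cm >= 5:
        return "small"   # small plot: 5 cm <= DBH < 15 cm
    return None          # below the 5 cm threshold: not inventoried
```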
Data processing involves using lookup tables to determine species-specific wood density, applying the allometric equation from Chave et al. (2014) to convert measurements to biomass, and scaling from biomass per plot to biomass per hectare (see Chave, Jérôme, et al. "Improved allometric models to estimate the aboveground biomass of tropical trees." Global Change Biology 20.10 (2014): 3177-3190).
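As a rough illustration of that per-tree conversion, the pantropical model of Chave et al. (2014), AGB = 0.0673 × (ρD²H)^0.976 (ρ wood density in g/cm³, D DBH in cm, H height in m, AGB in kg), can be sketched as below. This is a sketch under stated assumptions, not the project's exact processing pipeline; the function names are ours, and the default plot area of 0.05 ha follows the plot design described above:

```python
def tree_agb_kg(wood_density_g_cm3, dbh_cm, height_m):
    """Above-ground biomass (kg) of one tree, Chave et al. (2014) pantropical model."""
    return 0.0673 * (wood_density_g_cm3 * dbh_cm ** 2 * height_m) ** 0.976

def plot_biomass_t_ha(per_tree_agb_kg, plot_area_ha=0.05):
    """Scale the summed per-tree biomass (kg) of one plot to tonnes per hectare."""
    return sum(per_tree_agb_kg) / 1000.0 / plot_area_ha
```

For example, a tree with wood density 0.6 g/cm³, DBH 30 cm and height 25 m comes out at roughly 720 kg of above-ground biomass under this model.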


#### Who are the source data producers?
This data was collected by field teams from JKPP (https://jkpp.org/). Field teams consisted of a team lead, a botanist, and usually 3-5 other members trained in the use of forest measurement tools and methods by JKPP. Teams were not composed of professional foresters but did attend a training in collection techniques before going into the field.
#### Personal and Sensitive Information
This data does not contain personally identifiable information
### Contribute to this dataset
Additional field plot data can be contributed using this methodology via the High Carbon Stock Approach Field Data Collection forms in the ODK app.
## Dataset Card Contact
info@highcarbonstock.org |
liuyanchen1015/MULTI_VALUE_rte_null_prepositions | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 905029
num_examples: 2871
- name: train
num_bytes: 782979
num_examples: 2390
download_size: 1095383
dataset_size: 1688008
---
# Dataset Card for "MULTI_VALUE_rte_null_prepositions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
imppres | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
language:
- en
license:
- cc-by-nc-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- natural-language-inference
paperswithcode_id: imppres
pretty_name: IMPPRES
dataset_info:
- config_name: implicature_connectives
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: gold_label_log
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: gold_label_prag
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: spec_relation
dtype: string
- name: item_type
dtype: string
- name: trigger
dtype: string
- name: lexemes
dtype: string
splits:
- name: connectives
num_bytes: 221844
num_examples: 1200
download_size: 25478
dataset_size: 221844
- config_name: implicature_gradable_adjective
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: gold_label_log
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: gold_label_prag
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: spec_relation
dtype: string
- name: item_type
dtype: string
- name: trigger
dtype: string
- name: lexemes
dtype: string
splits:
- name: gradable_adjective
num_bytes: 153648
num_examples: 1200
download_size: 17337
dataset_size: 153648
- config_name: implicature_gradable_verb
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: gold_label_log
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: gold_label_prag
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: spec_relation
dtype: string
- name: item_type
dtype: string
- name: trigger
dtype: string
- name: lexemes
dtype: string
splits:
- name: gradable_verb
num_bytes: 180678
num_examples: 1200
download_size: 21504
dataset_size: 180678
- config_name: implicature_modals
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: gold_label_log
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: gold_label_prag
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: spec_relation
dtype: string
- name: item_type
dtype: string
- name: trigger
dtype: string
- name: lexemes
dtype: string
splits:
- name: modals
num_bytes: 178536
num_examples: 1200
download_size: 21179
dataset_size: 178536
- config_name: implicature_numerals_10_100
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: gold_label_log
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: gold_label_prag
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: spec_relation
dtype: string
- name: item_type
dtype: string
- name: trigger
dtype: string
- name: lexemes
dtype: string
splits:
- name: numerals_10_100
num_bytes: 208596
num_examples: 1200
download_size: 22640
dataset_size: 208596
- config_name: implicature_numerals_2_3
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: gold_label_log
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: gold_label_prag
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: spec_relation
dtype: string
- name: item_type
dtype: string
- name: trigger
dtype: string
- name: lexemes
dtype: string
splits:
- name: numerals_2_3
num_bytes: 188760
num_examples: 1200
download_size: 22218
dataset_size: 188760
- config_name: implicature_quantifiers
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: gold_label_log
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: gold_label_prag
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: spec_relation
dtype: string
- name: item_type
dtype: string
- name: trigger
dtype: string
- name: lexemes
dtype: string
splits:
- name: quantifiers
num_bytes: 176790
num_examples: 1200
download_size: 21017
dataset_size: 176790
- config_name: presupposition_all_n_presupposition
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: trigger
dtype: string
- name: trigger1
dtype: string
- name: trigger2
dtype: string
- name: presupposition
dtype: string
- name: gold_label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: UID
dtype: string
- name: pairID
dtype: string
- name: paradigmID
dtype: int16
splits:
- name: all_n_presupposition
num_bytes: 458460
num_examples: 1900
download_size: 43038
dataset_size: 458460
- config_name: presupposition_both_presupposition
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: trigger
dtype: string
- name: trigger1
dtype: string
- name: trigger2
dtype: string
- name: presupposition
dtype: string
- name: gold_label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: UID
dtype: string
- name: pairID
dtype: string
- name: paradigmID
dtype: int16
splits:
- name: both_presupposition
num_bytes: 432760
num_examples: 1900
download_size: 41142
dataset_size: 432760
- config_name: presupposition_change_of_state
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: trigger
dtype: string
- name: trigger1
dtype: string
- name: trigger2
dtype: string
- name: presupposition
dtype: string
- name: gold_label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: UID
dtype: string
- name: pairID
dtype: string
- name: paradigmID
dtype: int16
splits:
- name: change_of_state
num_bytes: 308595
num_examples: 1900
download_size: 35814
dataset_size: 308595
- config_name: presupposition_cleft_existence
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: trigger
dtype: string
- name: trigger1
dtype: string
- name: trigger2
dtype: string
- name: presupposition
dtype: string
- name: gold_label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: UID
dtype: string
- name: pairID
dtype: string
- name: paradigmID
dtype: int16
splits:
- name: cleft_existence
num_bytes: 363206
num_examples: 1900
download_size: 37597
dataset_size: 363206
- config_name: presupposition_cleft_uniqueness
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: trigger
dtype: string
- name: trigger1
dtype: string
- name: trigger2
dtype: string
- name: presupposition
dtype: string
- name: gold_label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: UID
dtype: string
- name: pairID
dtype: string
- name: paradigmID
dtype: int16
splits:
- name: cleft_uniqueness
num_bytes: 388747
num_examples: 1900
download_size: 38279
dataset_size: 388747
- config_name: presupposition_only_presupposition
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: trigger
dtype: string
- name: trigger1
dtype: string
- name: trigger2
dtype: string
- name: presupposition
dtype: string
- name: gold_label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: UID
dtype: string
- name: pairID
dtype: string
- name: paradigmID
dtype: int16
splits:
- name: only_presupposition
num_bytes: 348986
num_examples: 1900
download_size: 38126
dataset_size: 348986
- config_name: presupposition_possessed_definites_existence
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: trigger
dtype: string
- name: trigger1
dtype: string
- name: trigger2
dtype: string
- name: presupposition
dtype: string
- name: gold_label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: UID
dtype: string
- name: pairID
dtype: string
- name: paradigmID
dtype: int16
splits:
- name: possessed_definites_existence
num_bytes: 362302
num_examples: 1900
download_size: 38712
dataset_size: 362302
- config_name: presupposition_possessed_definites_uniqueness
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: trigger
dtype: string
- name: trigger1
dtype: string
- name: trigger2
dtype: string
- name: presupposition
dtype: string
- name: gold_label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: UID
dtype: string
- name: pairID
dtype: string
- name: paradigmID
dtype: int16
splits:
- name: possessed_definites_uniqueness
num_bytes: 459371
num_examples: 1900
download_size: 42068
dataset_size: 459371
- config_name: presupposition_question_presupposition
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: trigger
dtype: string
- name: trigger1
dtype: string
- name: trigger2
dtype: string
- name: presupposition
dtype: string
- name: gold_label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: UID
dtype: string
- name: pairID
dtype: string
- name: paradigmID
dtype: int16
splits:
- name: question_presupposition
num_bytes: 397195
num_examples: 1900
download_size: 41247
dataset_size: 397195
configs:
- config_name: implicature_connectives
data_files:
- split: connectives
path: implicature_connectives/connectives-*
- config_name: implicature_gradable_adjective
data_files:
- split: gradable_adjective
path: implicature_gradable_adjective/gradable_adjective-*
- config_name: implicature_gradable_verb
data_files:
- split: gradable_verb
path: implicature_gradable_verb/gradable_verb-*
- config_name: implicature_modals
data_files:
- split: modals
path: implicature_modals/modals-*
- config_name: implicature_numerals_10_100
data_files:
- split: numerals_10_100
path: implicature_numerals_10_100/numerals_10_100-*
- config_name: implicature_numerals_2_3
data_files:
- split: numerals_2_3
path: implicature_numerals_2_3/numerals_2_3-*
- config_name: implicature_quantifiers
data_files:
- split: quantifiers
path: implicature_quantifiers/quantifiers-*
- config_name: presupposition_all_n_presupposition
data_files:
- split: all_n_presupposition
path: presupposition_all_n_presupposition/all_n_presupposition-*
- config_name: presupposition_both_presupposition
data_files:
- split: both_presupposition
path: presupposition_both_presupposition/both_presupposition-*
- config_name: presupposition_change_of_state
data_files:
- split: change_of_state
path: presupposition_change_of_state/change_of_state-*
- config_name: presupposition_cleft_existence
data_files:
- split: cleft_existence
path: presupposition_cleft_existence/cleft_existence-*
- config_name: presupposition_cleft_uniqueness
data_files:
- split: cleft_uniqueness
path: presupposition_cleft_uniqueness/cleft_uniqueness-*
- config_name: presupposition_only_presupposition
data_files:
- split: only_presupposition
path: presupposition_only_presupposition/only_presupposition-*
- config_name: presupposition_possessed_definites_existence
data_files:
- split: possessed_definites_existence
path: presupposition_possessed_definites_existence/possessed_definites_existence-*
- config_name: presupposition_possessed_definites_uniqueness
data_files:
- split: possessed_definites_uniqueness
path: presupposition_possessed_definites_uniqueness/possessed_definites_uniqueness-*
- config_name: presupposition_question_presupposition
data_files:
- split: question_presupposition
path: presupposition_question_presupposition/question_presupposition-*
---
# Dataset Card for IMPPRES
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Github](https://github.com/facebookresearch/Imppres)
- **Repository:** [Github](https://github.com/facebookresearch/Imppres)
- **Paper:** [Aclweb](https://www.aclweb.org/anthology/2020.acl-main.768)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
IMPPRES contains over 25k semi-automatically generated sentence pairs illustrating well-studied pragmatic inference types. It is an NLI dataset following the format of SNLI (Bowman et al., 2015), MultiNLI (Williams et al., 2018) and XNLI (Conneau et al., 2018), and was created to evaluate how well trained NLI models recognize several classes of presuppositions and scalar implicatures.
### Supported Tasks and Leaderboards
Natural Language Inference.
### Languages
English.
## Dataset Structure
### Data Instances
The data consists of 2 configurations: implicature and presupposition.
Each configuration consists of several different sub-datasets:
**Presupposition**
- all_n_presupposition
- change_of_state
- cleft_uniqueness
- possessed_definites_existence
- question_presupposition
- both_presupposition
- cleft_existence
- only_presupposition
- possessed_definites_uniqueness
**Implicature**
- connectives
- gradable_adjective
- gradable_verb
- modals
- numerals_10_100
- numerals_2_3
- quantifiers
Each sentence type in IMPPRES is generated according to a template that specifies the linear order of the constituents in the sentence. The constituents are sampled from a vocabulary of over 3000 lexical items annotated with the grammatical features needed to ensure well-formedness. We semi-automatically generate IMPPRES using a codebase developed by Warstadt et al. (2019a) and significantly expanded for the BLiMP dataset (Warstadt et al., 2019b).
Here is an instance of the raw presupposition data from any sub-dataset:
```json
{
"sentence1": "All ten guys that proved to boast might have been divorcing.",
"sentence2": "There are exactly ten guys that proved to boast.",
"trigger": "modal",
"presupposition": "positive",
"gold_label": "entailment",
"UID": "all_n_presupposition",
"pairID": "9e",
"paradigmID": 0
}
```
and the raw implicature data from any sub-dataset:
```json
{
"sentence1": "That teenager couldn't yell.",
"sentence2": "That teenager could yell.",
"gold_label_log": "contradiction",
"gold_label_prag": "contradiction",
"spec_relation": "negation",
"item_type": "control",
"trigger": "modal",
"lexemes": "can - have to"
}
```
### Data Fields
**Presupposition**
There is a slight mapping between the raw data fields in the presupposition sub-datasets and the fields appearing in the HuggingFace datasets.
When dealing with the HF dataset, the following mapping of fields is applied:
```text
"premise" -> "sentence1"
"hypothesis"-> "sentence2"
"trigger" -> "trigger" or "Not_In_Example"
"trigger1" -> "trigger1" or "Not_In_Example"
"trigger2" -> "trigger2" or "Not_In_Example"
"presupposition" -> "presupposition" or "Not_In_Example"
"gold_label" -> "gold_label"
"UID" -> "UID"
"pairID" -> "pairID"
"paradigmID" -> "paradigmID"
```
For the most part, the raw fields remain unchanged. For the various `trigger` fields, however, a new mapping was introduced.
Some examples in the dataset have only the `trigger` field, while other examples have the `trigger1` and `trigger2` fields without the `trigger` or `presupposition` fields.
Most examples look like the one in the Data Instances section above. Occasionally, however, an example will look like:
```python
{
'sentence1': 'Did that committee know when Lissa walked through the cafe?',
'sentence2': 'That committee knew when Lissa walked through the cafe.',
'trigger1': 'interrogative',
'trigger2': 'unembedded',
'gold_label': 'neutral',
'control_item': True,
'UID': 'question_presupposition',
'pairID': '1821n',
'paradigmID': 95
}
```
In this example, `trigger1` and `trigger2` appear while `presupposition` and `trigger` are removed, keeping the length of the dictionary constant.
To account for these examples, the mapping above was introduced so that all examples accessed through the HF Datasets interface have the same fields.
If an example does not have a value for one of the fields, the field is kept in the dictionary but given the value `Not_In_Example`.
To illustrate this point, the example given in the Data Instances section above would look like the following in the HF Datasets:
```json
{
"premise": "All ten guys that proved to boast might have been divorcing.",
"hypothesis": "There are exactly ten guys that proved to boast.",
"trigger": "modal",
"trigger1": "Not_In_Example",
    "trigger2": "Not_In_Example",
"presupposition": "positive",
"gold_label": "entailment",
"UID": "all_n_presupposition",
"pairID": "9e",
"paradigmID": 0
}
```
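The fill-in behavior just illustrated can be sketched as a small helper. This is only a sketch of the mapping described above, not the actual loading script; `normalize_presupposition` and `PRESUPPOSITION_FIELDS` are illustrative names:

```python
# Sketch of the field mapping: rename sentence1/sentence2 and pad any
# missing trigger/presupposition fields with "Not_In_Example" so that
# every example exposes the same set of keys.
PRESUPPOSITION_FIELDS = [
    "trigger", "trigger1", "trigger2", "presupposition",
    "gold_label", "UID", "pairID", "paradigmID",
]

def normalize_presupposition(raw):
    example = {"premise": raw["sentence1"], "hypothesis": raw["sentence2"]}
    for field in PRESUPPOSITION_FIELDS:
        # Fields absent from the raw record are kept, with a sentinel value.
        example[field] = raw.get(field, "Not_In_Example")
    return example

raw = {
    "sentence1": "All ten guys that proved to boast might have been divorcing.",
    "sentence2": "There are exactly ten guys that proved to boast.",
    "trigger": "modal",
    "presupposition": "positive",
    "gold_label": "entailment",
    "UID": "all_n_presupposition",
    "pairID": "9e",
    "paradigmID": 0,
}
normalized = normalize_presupposition(raw)
print(normalized["trigger1"])  # Not_In_Example
```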
Below is a description of the fields:
```text
"premise": The premise.
"hypothesis": The hypothesis.
"trigger": A detailed discussion of trigger types appears in the paper.
"trigger1": A detailed discussion of trigger types appears in the paper.
"trigger2": A detailed discussion of trigger types appears in the paper.
"presupposition": positive or negative.
"gold_label": Corresponds to entailment, contradiction, or neutral.
"UID": Unique id.
"pairID": Sentence pair ID.
"paradigmID": ?
```
It is not immediately clear what the difference between `trigger`, `trigger1`, and `trigger2` is, or what the `paradigmID` refers to.
**Implicature**
For the `implicature` sub-datasets, only the mapping below is applied:
```text
"premise" -> "sentence1"
"hypothesis"-> "sentence2"
```
Here is a description of the fields:
```text
"premise": The premise.
"hypothesis": The hypothesis.
"gold_label_log": Gold label for a logical reading of the sentence pair.
"gold_label_prag": Gold label for a pragmatic reading of the sentence pair.
"spec_relation": ?
"item_type": ?
"trigger": A detailed discussion of trigger types appears in the paper.
"lexemes": ?
```
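Because each implicature pair carries both a logical and a pragmatic gold label, evaluation code has to choose which reading to score against. A minimal sketch of that choice (the helper name `gold_label_for` is illustrative, not part of the dataset):

```python
def gold_label_for(example, reading):
    """Return the gold label for the 'log' (logical) or 'prag' (pragmatic) reading."""
    if reading not in ("log", "prag"):
        raise ValueError("reading must be 'log' or 'prag'")
    return example[f"gold_label_{reading}"]

example = {
    "premise": "That teenager couldn't yell.",
    "hypothesis": "That teenager could yell.",
    "gold_label_log": "contradiction",
    "gold_label_prag": "contradiction",
}
print(gold_label_for(example, "prag"))  # contradiction
```

For controls like this one the two labels agree; the interesting cases are implicature items where the logical and pragmatic readings diverge.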
### Data Splits
As the dataset was created to test already trained models, the only split that exists is for testing.
## Dataset Creation
### Curation Rationale
IMPPRES was created to evaluate how well trained NLI models recognize several classes of presuppositions and scalar implicatures.
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
The annotations were generated semi-automatically.
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
IMPPRES is available under a Creative Commons Attribution-NonCommercial 4.0 International Public License ("The License"). You may not use these files except in compliance with the License. Please see the LICENSE file for more information before you use the dataset.
### Citation Information
```bibtex
@inproceedings{jeretic-etal-2020-natural,
title = "Are Natural Language Inference Models {IMPPRESsive}? {L}earning {IMPlicature} and {PRESupposition}",
author = "Jereti\v{c}, Paloma and
Warstadt, Alex and
Bhooshan, Suvrat and
Williams, Adina",
booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.acl-main.768",
doi = "10.18653/v1/2020.acl-main.768",
pages = "8690--8705",
abstract = "Natural language inference (NLI) is an increasingly important task for natural language understanding, which requires one to infer whether a sentence entails another. However, the ability of NLI models to make pragmatic inferences remains understudied. We create an IMPlicature and PRESupposition diagnostic dataset (IMPPRES), consisting of 32K semi-automatically generated sentence pairs illustrating well-studied pragmatic inference types. We use IMPPRES to evaluate whether BERT, InferSent, and BOW NLI models trained on MultiNLI (Williams et al., 2018) learn to make pragmatic inferences. Although MultiNLI appears to contain very few pairs illustrating these inference types, we find that BERT learns to draw pragmatic inferences. It reliably treats scalar implicatures triggered by {``}some{''} as entailments. For some presupposition triggers like {``}only{''}, BERT reliably recognizes the presupposition as an entailment, even when the trigger is embedded under an entailment canceling operator like negation. BOW and InferSent show weaker evidence of pragmatic reasoning. We conclude that NLI training encourages models to learn some, but not all, pragmatic inferences.",
}
```
### Contributions
Thanks to [@aclifton314](https://github.com/aclifton314) for adding this dataset. |
Xnhyacinth/Image | ---
dataset_info:
- config_name: NQ
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
- name: compressed_ctxs_1
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: compressed_ctxs_5
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: compressed_ctxs_10
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: compressed_ctxs_20
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: compressed_ctxs_50
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: compressed_ctxs_100
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
splits:
- name: train
num_bytes: 6106425228
num_examples: 79168
- name: eval
num_bytes: 675422872
num_examples: 8757
- name: test
num_bytes: 279441134
num_examples: 3610
download_size: 3931027405
dataset_size: 7061289234
- config_name: TQA
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: answers
sequence: string
- name: target
dtype: string
- name: ctxs
list:
- name: id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
- name: compressed_ctxs_1
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: compressed_ctxs_5
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: compressed_ctxs_10
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: compressed_ctxs_20
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: compressed_ctxs_50
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: compressed_ctxs_100
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
splits:
- name: train
num_bytes: 6116069275
num_examples: 78785
- name: eval
num_bytes: 685921423
num_examples: 8837
- name: test
num_bytes: 878592842
num_examples: 11313
download_size: 4438699237
dataset_size: 7680583540
- config_name: WQ
features:
- name: id
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: score
dtype: string
- name: text
dtype: string
- name: title
dtype: string
- name: compressed_ctxs_5
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: compressed_ctxs_10
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: compressed_ctxs_20
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: compressed_ctxs_50
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: compressed_ctxs_100
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
splits:
- name: train
num_bytes: 268644771
num_examples: 3478
- name: eval
num_bytes: 23143123
num_examples: 300
- name: test
num_bytes: 157146882
num_examples: 2032
download_size: 254281138
dataset_size: 448934776
configs:
- config_name: NQ
data_files:
- split: train
path: NQ/train-*
- split: eval
path: NQ/eval-*
- split: test
path: NQ/test-*
- config_name: TQA
data_files:
- split: train
path: TQA/train-*
- split: eval
path: TQA/eval-*
- split: test
path: TQA/test-*
- config_name: WQ
data_files:
- split: train
path: WQ/train-*
- split: eval
path: WQ/eval-*
- split: test
path: WQ/test-*
---
|
abdiharyadi/id_panl_bppt_with_amrbart_opus_mt_indobert_id_amr | ---
dataset_info:
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- id
- name: topic
dtype:
class_label:
names:
'0': Economy
'1': International
'2': Science
'3': Sport
- name: en_amr
dtype: string
- name: id_amr
dtype: string
splits:
- name: train
num_bytes: 583140
num_examples: 1220
download_size: 247241
dataset_size: 583140
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "id_panl_bppt_with_amrbart_opus_mt_indobert_id_amr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arbml/NArabizi | ---
dataset_info:
features:
- name: ID
dtype: string
- name: label
dtype:
class_label:
names:
        '0': NEU
        '1': NEG
        '2': MIX
        '3': POS
splits:
- name: test
num_bytes: 4034
num_examples: 144
- name: train
num_bytes: 27839
num_examples: 998
- name: validation
num_bytes: 3823
num_examples: 137
download_size: 12217
dataset_size: 35696
---
# Dataset Card for "NArabizi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_bigscience__bloom-560m | ---
pretty_name: Evaluation run of bigscience/bloom-560m
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bigscience/bloom-560m](https://huggingface.co/bigscience/bloom-560m) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 13 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigscience__bloom-560m\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-04T13:05:03.033636](https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloom-560m/blob/main/results_2023-12-04T13-05-03.033636.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.003032600454890068,\n\
\ \"acc_stderr\": 0.0015145735612245468\n },\n \"harness|gsm8k|5\"\
: {\n \"acc\": 0.003032600454890068,\n \"acc_stderr\": 0.0015145735612245468\n\
\ }\n}\n```"
repo_url: https://huggingface.co/bigscience/bloom-560m
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|arc:challenge|25_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T01_44_51.787860
path:
- '**/details_harness|drop|3_2023-10-17T01-44-51.787860.parquet'
- split: 2023_10_19T07_58_25.532907
path:
- '**/details_harness|drop|3_2023-10-19T07-58-25.532907.parquet'
- split: 2023_10_19T11_57_26.532188
path:
- '**/details_harness|drop|3_2023-10-19T11-57-26.532188.parquet'
- split: 2023_10_19T13_58_30.472160
path:
- '**/details_harness|drop|3_2023-10-19T13-58-30.472160.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T13-58-30.472160.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T01_44_51.787860
path:
- '**/details_harness|gsm8k|5_2023-10-17T01-44-51.787860.parquet'
- split: 2023_10_19T07_58_25.532907
path:
- '**/details_harness|gsm8k|5_2023-10-19T07-58-25.532907.parquet'
- split: 2023_10_19T11_57_26.532188
path:
- '**/details_harness|gsm8k|5_2023-10-19T11-57-26.532188.parquet'
- split: 2023_10_19T13_58_30.472160
path:
- '**/details_harness|gsm8k|5_2023-10-19T13-58-30.472160.parquet'
- split: 2023_12_03T15_01_55.935382
path:
- '**/details_harness|gsm8k|5_2023-12-03T15-01-55.935382.parquet'
- split: 2023_12_03T15_02_09.067243
path:
- '**/details_harness|gsm8k|5_2023-12-03T15-02-09.067243.parquet'
- split: 2023_12_03T16_04_42.088670
path:
- '**/details_harness|gsm8k|5_2023-12-03T16-04-42.088670.parquet'
- split: 2023_12_03T16_05_29.861058
path:
- '**/details_harness|gsm8k|5_2023-12-03T16-05-29.861058.parquet'
- split: 2023_12_04T09_54_26.106896
path:
- '**/details_harness|gsm8k|5_2023-12-04T09-54-26.106896.parquet'
- split: 2023_12_04T09_54_41.464190
path:
- '**/details_harness|gsm8k|5_2023-12-04T09-54-41.464190.parquet'
- split: 2023_12_04T13_04_03.136528
path:
- '**/details_harness|gsm8k|5_2023-12-04T13-04-03.136528.parquet'
- split: 2023_12_04T13_05_03.033636
path:
- '**/details_harness|gsm8k|5_2023-12-04T13-05-03.033636.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T13-05-03.033636.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hellaswag|10_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:50:46.994927.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T09:50:46.994927.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T09:50:46.994927.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T01_44_51.787860
path:
- '**/details_harness|winogrande|5_2023-10-17T01-44-51.787860.parquet'
- split: 2023_10_19T07_58_25.532907
path:
- '**/details_harness|winogrande|5_2023-10-19T07-58-25.532907.parquet'
- split: 2023_10_19T11_57_26.532188
path:
- '**/details_harness|winogrande|5_2023-10-19T11-57-26.532188.parquet'
- split: 2023_10_19T13_58_30.472160
path:
- '**/details_harness|winogrande|5_2023-10-19T13-58-30.472160.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T13-58-30.472160.parquet'
- config_name: results
data_files:
- split: 2023_08_09T09_50_46.994927
path:
- results_2023-08-09T09:50:46.994927.parquet
- split: 2023_10_17T01_44_51.787860
path:
- results_2023-10-17T01-44-51.787860.parquet
- split: 2023_10_19T07_58_25.532907
path:
- results_2023-10-19T07-58-25.532907.parquet
- split: 2023_10_19T11_57_26.532188
path:
- results_2023-10-19T11-57-26.532188.parquet
- split: 2023_10_19T13_58_30.472160
path:
- results_2023-10-19T13-58-30.472160.parquet
- split: 2023_12_03T15_01_55.935382
path:
- results_2023-12-03T15-01-55.935382.parquet
- split: 2023_12_03T15_02_09.067243
path:
- results_2023-12-03T15-02-09.067243.parquet
- split: 2023_12_03T16_04_42.088670
path:
- results_2023-12-03T16-04-42.088670.parquet
- split: 2023_12_03T16_05_29.861058
path:
- results_2023-12-03T16-05-29.861058.parquet
- split: 2023_12_04T09_54_26.106896
path:
- results_2023-12-04T09-54-26.106896.parquet
- split: 2023_12_04T09_54_41.464190
path:
- results_2023-12-04T09-54-41.464190.parquet
- split: 2023_12_04T13_04_03.136528
path:
- results_2023-12-04T13-04-03.136528.parquet
- split: 2023_12_04T13_05_03.033636
path:
- results_2023-12-04T13-05-03.033636.parquet
- split: latest
path:
- results_2023-12-04T13-05-03.033636.parquet
---
# Dataset Card for Evaluation run of bigscience/bloom-560m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bigscience/bloom-560m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bigscience/bloom-560m](https://huggingface.co/bigscience/bloom-560m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 13 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigscience__bloom-560m",
"harness_gsm8k_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-04T13:05:03.033636](https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloom-560m/blob/main/results_2023-12-04T13-05-03.033636.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245468
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245468
}
}
```
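For reference, the reported `acc_stderr` is consistent with the usual binomial standard error (with an n-1 denominator) over GSM8K's 1,319 test questions. The sketch below infers the number of correct answers from `acc`; the harness's exact implementation may differ:

```python
import math

n = 1319      # GSM8K test-set size
correct = 4   # inferred from acc = 4/1319 = 0.003032600...
acc = correct / n
stderr = math.sqrt(acc * (1 - acc) / (n - 1))
print(f"acc={acc:.6f}, stderr={stderr:.6f}")  # acc=0.003033, stderr=0.001515
```

This reproduces the `acc` and `acc_stderr` values above to the reported precision.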
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
sunzeyeah/chinese_chatgpt_corpus | ---
annotations_creators:
- no-annotation
language_creators:
- unknown
language:
- zh
license:
- unknown
multilinguality:
- monolingual
pretty_name: Chinese-ChatGPT-Corpus
size_categories:
- 5M<n<10M
task_categories:
- text-generation
- text2text-generation
- question-answering
- reinforcement-learning
task_ids:
- language-modeling
- masked-language-modeling
---
# Dataset Card for chinese_chatgpt_corpus
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Size of downloaded dataset files:** 5.05 GB
- **Size of the generated dataset:** 0 GB
- **Total amount of disk used:** 5.05 GB
### Dataset Summary
This repo collects Chinese corpora for Supervised Fine-tuning (SFT) and Reinforcement Learning from Human Feedback (RLHF).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
Chinese
## Dataset Structure
### Data Instances
#### train_data_external_v1.jsonl
- **Size of downloaded dataset files:** 5.04 GB
- **Size of the generated dataset:** 0 GB
- **Total amount of disk used:** 5.04 GB
An example looks as follows:
```
{
"prompt": "问题:有没有给未成年贷款的有的联系",
"answers":
[
{
"answer": "若通过招行办理,我行规定,贷款人年龄需年满18岁,且年龄加贷款年限不得超过70岁。如果您持有我行信用卡附属卡,可尝试办理预借现金。",
"score": 1
}
],
"prefix": "回答:"
}
```
#### dev_data_external_v1.jsonl
- **Size of downloaded dataset files:** 9.55 MB
- **Size of the generated dataset:** 0 MB
- **Total amount of disk used:** 9.55 MB
An example looks as follows:
```
{
"prompt": "初学纹发现1/2\"的管螺纹并不是1\"的一半。不知道其中的原因,请各位指点。",
"answers":
[
{
"answer": "管螺纹的名义尺寸是“管子”的孔(内)径,而管子的壁厚不是两倍。所以,1/2\"的管螺纹并不是1\"的一半,",
"score": 1
}
],
"prefix": "回答:"
}
```
### Data Fields
The data fields are the same among all splits.
#### train_data_external_v1.jsonl
- `prompt`: prompt, `string`
- `answers`: list of answers
- `answer`: answer, `string`
- `score`: score of answer, `int`
- `prefix`: prefix to the answer, `string`
#### dev_data_external_v1.jsonl
- `prompt`: prompt, `string`
- `answers`: list of answers
- `answer`: answer, `string`
- `score`: score of answer, `int`
- `prefix`: prefix to the answer, `string`
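Given this schema, each line of either file parses with the standard `json` module. A minimal sketch (the helper name and answer-selection logic are illustrative, not part of the release):

```python
import json

def read_pairs(path):
    """Yield (prompt, response) pairs from a JSONL file in the schema above,
    keeping the highest-scored answer and prepending its prefix."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            best = max(record["answers"], key=lambda a: a["score"])
            yield record["prompt"], record["prefix"] + best["answer"]
```

For SFT only the top-scored answer is needed; for RLHF reward modeling, the full `answers` list with its `score` values would be kept instead.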
### Data Splits
| name | train |
|----------|-------:|
|train_data_external_v1.jsonl|5477982|
|dev_data_external_v1.jsonl|10000|
## Dataset Creation
### Curation Rationale
Link to GitHub: [data_prepare](https://github.com/sunzeyeah/RLHF/blob/master/src/data_prepare.py)
### Source Data
#### Initial Data Collection and Normalization
- [百科](https://github.com/brightmart/nlp_chinese_corpus)
- [知道问答](https://github.com/SophonPlus/ChineseNlpCorpus)
- [对联](https://github.com/wb14123/couplet-dataset/releases/download/1.0/couplet.tar.gz)
- [古文](https://github.com/NiuTrans/Classical-Modern)
- [古诗词](https://github.com/chinese-poetry/chinese-poetry)
- 微博新闻评论
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
medalpaca/medical_meadow_wikidoc_patient_information | ---
license: cc
task_categories:
- question-answering
language:
- en
---
# Dataset Card for WikiDoc
For the dataset containing rephrased content from the living textbook refer to [this dataset](https://huggingface.co/datasets/medalpaca/medical_meadow_wikidoc)
## Dataset Description
- **Source:** https://www.wikidoc.org/index.php/Main_Page
- **Repository:** https://github.com/kbressem/medalpaca
- **Paper:** TBA
### Dataset Summary
This dataset contains medical question-answer pairs extracted from [WikiDoc](https://www.wikidoc.org/index.php/Main_Page),
a collaborative platform for medical professionals to share and contribute to up-to-date medical knowledge.
The platform has two main subsites, the "Living Textbook" and "Patient Information". The "Living Textbook"
contains chapters for various medical specialties, which we crawled. We then used GPT-3.5-Turbo to rephrase
each paragraph heading into a question and used the paragraph as the answer. "Patient Information" is structured differently,
in that each section subheading is already a question, making rephrasing unnecessary.
**Note:** This dataset is still a WIP. While the Q/A pairs from "Patient Information" seem to be mostly correct,
the conversion using GPT-3.5-Turbo yielded unsatisfactory results in approximately 30% of cases. We are in the process of cleaning this dataset.
### Citation Information
TBA |
bigbio/ebm_pico |
---
language:
- en
bigbio_language:
- English
license: unknown
multilinguality: monolingual
bigbio_license_shortname: UNKNOWN
pretty_name: EBM NLP
homepage: https://github.com/bepnye/EBM-NLP
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- NAMED_ENTITY_RECOGNITION
---
# Dataset Card for EBM NLP
## Dataset Description
- **Homepage:** https://github.com/bepnye/EBM-NLP
- **Pubmed:** True
- **Public:** True
- **Tasks:** NER
This corpus release contains 4,993 abstracts annotated with (P)articipants,
(I)nterventions, and (O)utcomes. Training labels are sourced from AMT workers and
aggregated to reduce noise. Test labels are collected from medical professionals.
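The card notes that training labels from AMT workers are aggregated to reduce noise. One simple aggregation strategy, shown here only as an illustration and not necessarily the scheme used for this corpus, is per-token majority voting across annotators:

```python
from collections import Counter

def majority_vote(annotations):
    """Aggregate token-level labels from several annotators by majority vote.

    `annotations` is a list of label sequences, one per annotator, all
    covering the same tokens. Ties are broken alphabetically so the
    result is deterministic.
    """
    aggregated = []
    for token_labels in zip(*annotations):
        counts = Counter(token_labels)
        best = min(counts, key=lambda lab: (-counts[lab], lab))
        aggregated.append(best)
    return aggregated

# Three hypothetical workers labelling five tokens; "N" marks tokens
# outside any (P)articipant/(I)ntervention/(O)utcome span.
worker_labels = [
    ["N", "N", "P", "P", "N"],
    ["N", "I", "P", "P", "N"],
    ["N", "N", "P", "N", "N"],
]
labels = majority_vote(worker_labels)  # ["N", "N", "P", "P", "N"]
```

Real crowd-label aggregation often weights workers by estimated reliability (e.g. Dawid-Skene style models) rather than treating all votes equally.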
## Citation Information
```
@inproceedings{nye-etal-2018-corpus,
title = "A Corpus with Multi-Level Annotations of Patients, Interventions and Outcomes to Support Language Processing for Medical Literature",
author = "Nye, Benjamin and
Li, Junyi Jessy and
Patel, Roma and
Yang, Yinfei and
Marshall, Iain and
Nenkova, Ani and
Wallace, Byron",
booktitle = "Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2018",
address = "Melbourne, Australia",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/P18-1019",
doi = "10.18653/v1/P18-1019",
pages = "197--207",
}
```
|
open-llm-leaderboard/details_sail__Sailor-7B-Chat | ---
pretty_name: Evaluation run of sail/Sailor-7B-Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sail/Sailor-7B-Chat](https://huggingface.co/sail/Sailor-7B-Chat) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sail__Sailor-7B-Chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T06:48:11.877072](https://huggingface.co/datasets/open-llm-leaderboard/details_sail__Sailor-7B-Chat/blob/main/results_2024-03-11T06-48-11.877072.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5592083287688164,\n\
\ \"acc_stderr\": 0.033829004333336674,\n \"acc_norm\": 0.5649028875539783,\n\
\ \"acc_norm_stderr\": 0.03453333687790441,\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361,\n \"mc2\": 0.4409085499905092,\n\
\ \"mc2_stderr\": 0.014918521172966043\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.48976109215017066,\n \"acc_stderr\": 0.014608326906285019,\n\
\ \"acc_norm\": 0.523037542662116,\n \"acc_norm_stderr\": 0.014595873205358269\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5538737303326031,\n\
\ \"acc_stderr\": 0.0049607323822552455,\n \"acc_norm\": 0.7501493726349333,\n\
\ \"acc_norm_stderr\": 0.004320416477957659\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009794,\n\
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009794\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.041227287076512825,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.041227287076512825\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.03268335899936337,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.03268335899936337\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997695,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997695\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n\
\ \"acc_stderr\": 0.027045746573534323,\n \"acc_norm\": 0.6548387096774193,\n\
\ \"acc_norm_stderr\": 0.027045746573534323\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6919191919191919,\n \"acc_stderr\": 0.032894773300986155,\n \"\
acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.032894773300986155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.03074890536390989,\n\
\ \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.03074890536390989\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5358974358974359,\n \"acc_stderr\": 0.025285585990017848,\n\
\ \"acc_norm\": 0.5358974358974359,\n \"acc_norm_stderr\": 0.025285585990017848\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945273,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945273\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5126050420168067,\n \"acc_stderr\": 0.03246816765752174,\n \
\ \"acc_norm\": 0.5126050420168067,\n \"acc_norm_stderr\": 0.03246816765752174\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7706422018348624,\n \"acc_stderr\": 0.018025349724618684,\n \"\
acc_norm\": 0.7706422018348624,\n \"acc_norm_stderr\": 0.018025349724618684\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n\
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n\
\ \"acc_stderr\": 0.03273766725459156,\n \"acc_norm\": 0.6098654708520179,\n\
\ \"acc_norm_stderr\": 0.03273766725459156\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6073619631901841,\n \"acc_stderr\": 0.03836740907831029,\n\
\ \"acc_norm\": 0.6073619631901841,\n \"acc_norm_stderr\": 0.03836740907831029\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.756066411238825,\n\
\ \"acc_stderr\": 0.01535721266582947,\n \"acc_norm\": 0.756066411238825,\n\
\ \"acc_norm_stderr\": 0.01535721266582947\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.02632981334194624,\n\
\ \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.02632981334194624\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2581005586592179,\n\
\ \"acc_stderr\": 0.014635185616527822,\n \"acc_norm\": 0.2581005586592179,\n\
\ \"acc_norm_stderr\": 0.014635185616527822\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.027530078447110303,\n\
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.027530078447110303\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.02709865262130175,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.02709865262130175\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6172839506172839,\n \"acc_stderr\": 0.027044538138402612,\n\
\ \"acc_norm\": 0.6172839506172839,\n \"acc_norm_stderr\": 0.027044538138402612\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612503,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612503\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41395045632333766,\n\
\ \"acc_stderr\": 0.012579699631289262,\n \"acc_norm\": 0.41395045632333766,\n\
\ \"acc_norm_stderr\": 0.012579699631289262\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5036764705882353,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5359477124183006,\n \"acc_stderr\": 0.020175488765484043,\n \
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.020175488765484043\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\
\ \"acc_stderr\": 0.047381987035454834,\n \"acc_norm\": 0.5727272727272728,\n\
\ \"acc_norm_stderr\": 0.047381987035454834\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982066,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982066\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.03036049015401464,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.03036049015401464\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457922,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457922\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361,\n \"mc2\": 0.4409085499905092,\n\
\ \"mc2_stderr\": 0.014918521172966043\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7079715864246251,\n \"acc_stderr\": 0.012779198491754027\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.30401819560272936,\n \
\ \"acc_stderr\": 0.012670420440198659\n }\n}\n```"
repo_url: https://huggingface.co/sail/Sailor-7B-Chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|arc:challenge|25_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|gsm8k|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hellaswag|10_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T06-48-11.877072.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T06-48-11.877072.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- '**/details_harness|winogrande|5_2024-03-11T06-48-11.877072.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T06-48-11.877072.parquet'
- config_name: results
data_files:
- split: 2024_03_11T06_48_11.877072
path:
- results_2024-03-11T06-48-11.877072.parquet
- split: latest
path:
- results_2024-03-11T06-48-11.877072.parquet
---
# Dataset Card for Evaluation run of sail/Sailor-7B-Chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sail/Sailor-7B-Chat](https://huggingface.co/sail/Sailor-7B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sail__Sailor-7B-Chat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-11T06:48:11.877072](https://huggingface.co/datasets/open-llm-leaderboard/details_sail__Sailor-7B-Chat/blob/main/results_2024-03-11T06-48-11.877072.json) (note that there might be results for other tasks in this repo if successive evals didn't cover the same tasks; you can find each of them in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5592083287688164,
"acc_stderr": 0.033829004333336674,
"acc_norm": 0.5649028875539783,
"acc_norm_stderr": 0.03453333687790441,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361,
"mc2": 0.4409085499905092,
"mc2_stderr": 0.014918521172966043
},
"harness|arc:challenge|25": {
"acc": 0.48976109215017066,
"acc_stderr": 0.014608326906285019,
"acc_norm": 0.523037542662116,
"acc_norm_stderr": 0.014595873205358269
},
"harness|hellaswag|10": {
"acc": 0.5538737303326031,
"acc_stderr": 0.0049607323822552455,
"acc_norm": 0.7501493726349333,
"acc_norm_stderr": 0.004320416477957659
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.039993097127774734,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.039993097127774734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009794,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009794
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.041227287076512825,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.041227287076512825
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.03268335899936337,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.03268335899936337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997695,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997695
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.027045746573534323,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.027045746573534323
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.032894773300986155,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.032894773300986155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7616580310880829,
"acc_stderr": 0.03074890536390989,
"acc_norm": 0.7616580310880829,
"acc_norm_stderr": 0.03074890536390989
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5358974358974359,
"acc_stderr": 0.025285585990017848,
"acc_norm": 0.5358974358974359,
"acc_norm_stderr": 0.025285585990017848
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945273,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945273
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5126050420168067,
"acc_stderr": 0.03246816765752174,
"acc_norm": 0.5126050420168067,
"acc_norm_stderr": 0.03246816765752174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7706422018348624,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.7706422018348624,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6098654708520179,
"acc_stderr": 0.03273766725459156,
"acc_norm": 0.6098654708520179,
"acc_norm_stderr": 0.03273766725459156
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6073619631901841,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.6073619631901841,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.756066411238825,
"acc_stderr": 0.01535721266582947,
"acc_norm": 0.756066411238825,
"acc_norm_stderr": 0.01535721266582947
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6040462427745664,
"acc_stderr": 0.02632981334194624,
"acc_norm": 0.6040462427745664,
"acc_norm_stderr": 0.02632981334194624
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2581005586592179,
"acc_stderr": 0.014635185616527822,
"acc_norm": 0.2581005586592179,
"acc_norm_stderr": 0.014635185616527822
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.027530078447110303,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.027530078447110303
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.02709865262130175,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.02709865262130175
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6172839506172839,
"acc_stderr": 0.027044538138402612,
"acc_norm": 0.6172839506172839,
"acc_norm_stderr": 0.027044538138402612
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612503,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612503
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41395045632333766,
"acc_stderr": 0.012579699631289262,
"acc_norm": 0.41395045632333766,
"acc_norm_stderr": 0.012579699631289262
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.020175488765484043,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.020175488765484043
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.047381987035454834,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.047381987035454834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982066,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982066
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.03036049015401464,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.03036049015401464
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361,
"mc2": 0.4409085499905092,
"mc2_stderr": 0.014918521172966043
},
"harness|winogrande|5": {
"acc": 0.7079715864246251,
"acc_stderr": 0.012779198491754027
},
"harness|gsm8k|5": {
"acc": 0.30401819560272936,
"acc_stderr": 0.012670420440198659
}
}
```
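The per-task entries above all share the same shape, so an informal macro-average over the MMLU (`hendrycksTest`) tasks can be computed with a few lines of plain Python. This is a sketch, not part of the official leaderboard tooling; the `results` dict below is a hand-copied subset of the JSON above, used purely for illustration.

```python
# Macro-average accuracy over hendrycksTest tasks.
# `results` is a hand-copied subset of the JSON above (illustration only);
# in practice you would json.load() the full results file.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4740740740740741},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5921052631578947},
}

# Keep only the MMLU subtasks and average their accuracies.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"macro-average acc over {len(mmlu_accs)} tasks: {macro_avg:.4f}")
```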
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pawlo2013/one_piece_dataset | ---
dataset_info:
features:
- name: full_colour
dtype: image
- name: sketch
dtype: image
splits:
- name: train
num_bytes: 170204480.0
num_examples: 922
download_size: 170225532
dataset_size: 170204480.0
---
# Dataset Card for "one_piece_dataset"
<h1> This dataset contains 922 images taken from the One Piece anime; each row contains a coloured frame and its corresponding sketch. </h1>
<img alt="example images" src="./combined.png" />
<h2> Example Setup (the images are not normalized) </h2>
```python
from torch.utils.data import DataLoader
from torchvision import transforms
from datasets import load_dataset

transform = transforms.Compose([
    transforms.Resize((128, 128)),
    transforms.ToTensor(),
    transforms.Lambda(lambda t: (t * 2) - 1),  # map [0, 1] -> [-1, 1]
])

def preprocess(examples):
    examples["sketch_pixel_values"] = [transform(image.convert("RGB")) for image in examples["sketch"]]
    examples["full_colour_pixel_values"] = [transform(image.convert("RGB")) for image in examples["full_colour"]]
    del examples["sketch"]
    del examples["full_colour"]
    return examples

def train():
    dataset = load_dataset("pawlo2013/one_piece_dataset", split="train")
    transformed_dataset = dataset.with_transform(preprocess)
    dataloader = DataLoader(transformed_dataset, batch_size=16, shuffle=True, num_workers=0)
```
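The `Lambda` step in the setup above maps pixel values from `[0, 1]` to `[-1, 1]`; to display or save model outputs as images you need the inverse map. A minimal sketch of the two mappings, written element-wise on a single scalar value for clarity (the tensor version is the same expression applied to the whole tensor):

```python
def normalize(t):
    """Map a pixel value from [0, 1] to [-1, 1], as the Lambda transform does."""
    return t * 2 - 1

def denormalize(t):
    """Inverse map from [-1, 1] back to [0, 1], e.g. before saving an image."""
    return (t + 1) / 2

# Round-tripping a value recovers it exactly.
print(denormalize(normalize(0.25)))
```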
MLP-Lemma/lemma-crl-data | ---
dataset_info:
features:
- name: cut_llama_input_ids
sequence:
sequence: int64
- name: cut_text
dtype: string
splits:
- name: train
num_bytes: 14815894322
num_examples: 74004228
download_size: 5590871486
dataset_size: 14815894322
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/python-code-instructions-18k-alpaca-standardized_cluster_3 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 23207740
num_examples: 2351
download_size: 6567010
dataset_size: 23207740
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python-code-instructions-18k-alpaca-standardized_cluster_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DanielSongShen/CLIP-food101-image-dataset-small_latents | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': apple_pie
'1': baby_back_ribs
'2': baklava
'3': beef_carpaccio
'4': beef_tartare
'5': beet_salad
'6': beignets
'7': bibimbap
'8': bread_pudding
'9': breakfast_burrito
'10': bruschetta
'11': caesar_salad
'12': cannoli
'13': caprese_salad
'14': carrot_cake
'15': ceviche
'16': cheesecake
'17': cheese_plate
'18': chicken_curry
'19': chicken_quesadilla
'20': chicken_wings
'21': chocolate_cake
'22': chocolate_mousse
'23': churros
'24': clam_chowder
'25': club_sandwich
'26': crab_cakes
'27': creme_brulee
'28': croque_madame
'29': cup_cakes
'30': deviled_eggs
'31': donuts
'32': dumplings
'33': edamame
'34': eggs_benedict
'35': escargots
'36': falafel
'37': filet_mignon
'38': fish_and_chips
'39': foie_gras
'40': french_fries
'41': french_onion_soup
'42': french_toast
'43': fried_calamari
'44': fried_rice
'45': frozen_yogurt
'46': garlic_bread
'47': gnocchi
'48': greek_salad
'49': grilled_cheese_sandwich
'50': grilled_salmon
'51': guacamole
'52': gyoza
'53': hamburger
'54': hot_and_sour_soup
'55': hot_dog
'56': huevos_rancheros
'57': hummus
'58': ice_cream
'59': lasagna
'60': lobster_bisque
'61': lobster_roll_sandwich
'62': macaroni_and_cheese
'63': macarons
'64': miso_soup
'65': mussels
'66': nachos
'67': omelette
'68': onion_rings
'69': oysters
'70': pad_thai
'71': paella
'72': pancakes
'73': panna_cotta
'74': peking_duck
'75': pho
'76': pizza
'77': pork_chop
'78': poutine
'79': prime_rib
'80': pulled_pork_sandwich
'81': ramen
'82': ravioli
'83': red_velvet_cake
'84': risotto
'85': samosa
'86': sashimi
'87': scallops
'88': seaweed_salad
'89': shrimp_and_grits
'90': spaghetti_bolognese
'91': spaghetti_carbonara
'92': spring_rolls
'93': steak
'94': strawberry_shortcake
'95': sushi
'96': tacos
'97': takoyaki
'98': tiramisu
'99': tuna_tartare
'100': waffles
- name: CLIP_image_latent
sequence:
sequence: float32
splits:
- name: train
num_bytes: 179491954.0
num_examples: 4000
- name: test
num_bytes: 45232386.0
num_examples: 1000
download_size: 229755095
dataset_size: 224724340.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
pranjalipathre/i2i | ---
dataset_info:
config_name: video_01
features:
- name: initial_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 3604
num_examples: 10
download_size: 548787
dataset_size: 3604
--- |
mkarots/test_repo | ---
license: mit
language:
- en
tags:
- test
- dev
pretty_name: testing-dataset
size_categories:
- n<1K
--- |
ibrahimahmood/PIDRAY | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': images
'1': labels
splits:
- name: train
num_bytes: 6424849.0
num_examples: 60
download_size: 6415751
dataset_size: 6424849.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
result-kand2-sdxl-wuerst-karlo/023acaec | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 233
num_examples: 10
download_size: 1392
dataset_size: 233
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "023acaec"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
foilfoilfoil/LaminiChatML1024 | ---
license: other
---
|
blanchon/AID | ---
license:
- unknown
task_categories:
- image-classification
language:
- en
tags:
- remote-sensing
- earth-observation
- geospatial
- satellite-imagery
pretty_name: Aerial Image Dataset (AID)
size_categories:
- 1K<n<10K
---
# Aerial Image Dataset (AID)

## Description
The Aerial Image Dataset (AID) is a scene classification dataset consisting of 10,000 RGB images, each with a resolution of 600x600 pixels. These images have been extracted using [Google Earth](https://earth.google.com/web/) and cover various scenes from regions and countries around the world. AID comprises 30 different scene categories, with several hundred images per class.
The dataset comprises the following 30 aerial scene types: airport, bare land, baseball field, beach, bridge, center, church, commercial, dense residential, desert, farmland, forest, industrial, meadow, medium residential, mountain, park, parking, playground, pond, port, railway station, resort, river, school, sparse residential, square, stadium, storage tanks, and viaduct. All images were labelled by specialists in the field of remote-sensing image interpretation (samples of each class are shown in Fig. 1 of the original paper). In all, AID contains 10,000 images across 30 classes.
The dataset is designed for the evaluation of aerial scene classification algorithms and models. It is considered a relatively easy dataset, with approximately 90% accuracy achievable using a VGG-16 architecture.
## Details
## Structure
```tree
.
├── README.md
└── data
├── Airport
│ ├── airport_1.png
│ ├── airport_2.png
│ ├── ...
│ └── airport_360.png
├── BareLand
│ ├── bareland_1.png
│ ├── ...
│ └── bareland_310.png
├── ...
└── Viaduct
```
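Given the folder layout above, the class label of each image can be recovered from its parent directory name. A minimal sketch (file paths are illustrative):

```python
from pathlib import Path

def label_from_path(path: str) -> str:
    """Return the scene class implied by the layout above,
    e.g. data/Airport/airport_1.png -> "Airport"."""
    return Path(path).parent.name

print(label_from_path("data/Airport/airport_1.png"))  # Airport
```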
### Statistics
- Total Number of Images: 10,000
- Image Resolution: 600x600 pixels
- Scene Categories: 30
- Dataset Size: 2.6GB
## Citation
If you use the Aerial Image Dataset (AID) in your research, please consider citing the following publication:
```bibtex
@article{xia2017aid,
title = {AID: A benchmark data set for performance evaluation of aerial scene classification},
author = {Xia, Gui-Song and Hu, Jingwen and Hu, Fan and Shi, Baoguang and Bai, Xiang and Zhong, Yanfei and Zhang, Liangpei and Lu, Xiaoqiang},
journal = {IEEE Transactions on Geoscience and Remote Sensing},
volume = {55},
number = {7},
pages = {3965-3981},
year = {2017},
publisher = {IEEE}
}
```
Paper with code: https://paperswithcode.com/dataset/aid |
liuyanchen1015/VALUE_wnli_got | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 5514
num_examples: 20
- name: train
num_bytes: 969
num_examples: 6
download_size: 7711
dataset_size: 6483
---
# Dataset Card for "VALUE_wnli_got"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
glaiveai/glaive-function-calling-v2 | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
size_categories:
- 100K<n<1M
--- |
neeva/query2query_evaluation | ---
task_categories:
- sentence-similarity
---
## Description
This dataset contains triples of the form "query1", "query2", "label" where labels are mapped as follows
- similar: 1
- not similar: 0
- ambiguous: -1 |
minimindy/lora-checkpoint-50 | ---
library_name: peft
base_model: baffo32/decapoda-research-llama-7B-hf
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: True
- load_in_4bit: False
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: fp4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float32
### Framework versions
- PEFT 0.6.3.dev0 |
newsph_nli | ---
annotations_creators:
- machine-generated
language_creators:
- found
language:
- tl
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- natural-language-inference
paperswithcode_id: newsph-nli
pretty_name: NewsPH NLI
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 154510599
num_examples: 420000
- name: test
num_bytes: 3283665
num_examples: 9000
- name: validation
num_bytes: 33015530
num_examples: 90000
download_size: 76565287
dataset_size: 190809794
---
# Dataset Card for NewsPH NLI
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [NewsPH NLI homepage](https://github.com/jcblaisecruz02/Filipino-Text-Benchmarks)
- **Repository:** [NewsPH NLI repository](https://github.com/jcblaisecruz02/Filipino-Text-Benchmarks)
- **Paper:** [Arxiv paper](https://arxiv.org/pdf/2010.11574.pdf)
- **Leaderboard:**
- **Point of Contact:** [Jan Christian Cruz](mailto:jan_christian_cruz@dlsu.edu.ph)
### Dataset Summary
The first benchmark dataset for sentence entailment in the low-resource Filipino language, constructed by exploiting the structure of news articles. It contains 600,000 premise-hypothesis pairs in a 70-15-15 split for training, validation, and testing.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The dataset contains news articles in Filipino (Tagalog) scraped from all major Philippine news sites online.
## Dataset Structure
### Data Instances
Sample data:
```json
{
  "premise": "Alam ba ninyo ang ginawa ni Erap na noon ay lasing na lasing na rin?",
  "hypothesis": "Ininom niya ang alak na pinagpulbusan!",
  "label": "0"
}
```
### Data Fields
[More Information Needed]
### Data Splits
Contains 600,000 premise-hypothesis pairs in a 70-15-15 split for training, validation, and testing.
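The split arithmetic above can be sketched as follows (a back-of-the-envelope check, not the actual loader used to build the dataset):

```python
def split_sizes(total: int, pcts=(70, 15, 15)) -> tuple:
    """Split a pair count by percentage, e.g. 600,000 at 70-15-15."""
    return tuple(total * p // 100 for p in pcts)

print(split_sizes(600_000))  # (420000, 90000, 90000)
```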
## Dataset Creation
### Curation Rationale
We propose the use of news articles for automatically creating benchmark datasets for NLI because of two reasons. First, news articles commonly use single-sentence paragraphing, meaning every paragraph in a news article is limited to a single sentence. Second, straight news articles follow the “inverted pyramid” structure, where every succeeding paragraph builds upon the premise of those that came before it, with the most important information on top and the least important towards the end.
### Source Data
#### Initial Data Collection and Normalization
To create the dataset, we scrape news articles from all major Philippine news sites online. We collect a total of 229,571 straight news articles, which we then lightly preprocess to remove extraneous unicode characters and correct minimal misspellings. No further preprocessing is done to preserve information in the data.
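The light preprocessing described above might look like the following sketch; the exact normalization rules are an assumption, since the card only states that extraneous unicode characters were removed:

```python
import unicodedata

def light_clean(text: str) -> str:
    # Normalize compatibility forms, then drop non-printable characters
    # (keeping newlines and tabs); misspelling fixes would happen separately.
    text = unicodedata.normalize("NFKC", text)
    return "".join(ch for ch in text if ch.isprintable() or ch in "\n\t")

print(light_clean("Balita\u200b ngayon"))  # zero-width space removed
```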
#### Who are the source language producers?
The dataset was created by Jan Christian Blaise Cruz, Jose Kristian Resabal, James Lin, Dan John Velasco, and Charibeth Cheng from De La Salle University and the University of the Philippines.
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
Jan Christian Blaise Cruz, Jose Kristian Resabal, James Lin, Dan John Velasco and Charibeth Cheng
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[Jan Christian Blaise Cruz](mailto:jan_christian_cruz@dlsu.edu.ph)
### Licensing Information
[More Information Needed]
### Citation Information
```bibtex
@article{cruz2020investigating,
  title={Investigating the True Performance of Transformers in Low-Resource Languages: A Case Study in Automatic Corpus Creation},
  author={Jan Christian Blaise Cruz and Jose Kristian Resabal and James Lin and Dan John Velasco and Charibeth Cheng},
  journal={arXiv preprint arXiv:2010.11574},
  year={2020}
}
```
### Contributions
Thanks to [@anaerobeth](https://github.com/anaerobeth) for adding this dataset. |
nisaar/Lawyer_GPT_India | ---
license: apache-2.0
---
**Dataset Card for Indian Polity Question-Answer Dataset**
---
**Dataset Summary**
This dataset contains a collection of question-answer pairs on the subject of Indian Polity. The aim is to provide comprehensive answers to a wide range of questions pertaining to the Indian Constitution, judiciary, legislative, and various socio-political issues in India. It serves as a valuable resource for learners, researchers, and AI systems seeking to understand or respond to questions about Indian Polity.
---
**Supported Tasks and Leaderboards**
This dataset is useful for tasks such as question answering, text comprehension, language modelling, and conversational AI development. There's no specific leaderboard associated with this dataset.
---
**Languages**
The dataset is in English.
---
**Dataset Structure**
- **Data Instances**
Each instance in the dataset consists of a pair of a human-posed question and an assistant-provided answer on a specific topic in Indian Polity.
- **Data Fields**
1. Question: A text field containing the question.
2. Answer: A text field containing the corresponding answer.
- **Data Splits**
The dataset isn't divided into standard splits of training, validation, and test sets.
---
**Dataset Creation**
- **Curation Rationale**
The dataset was curated to provide accurate and comprehensive answers to a range of questions about Indian Polity. It covers fundamental rights, constitutional provisions, legislative procedures, and socio-political issues, among others.
- **Source Data**
- **Initial Data Collection and Normalization**
Data collection involved generating questions on Indian Polity topics and providing detailed answers.
- **Who are the source language producers?**
The language was produced by a language model trained by OpenAI.
---
**Annotations**
- **Annotation process**
Not applicable as the dataset doesn't contain annotations.
- **Who are the annotators?**
Not applicable as the dataset doesn't contain annotations.
---
**Personal and Sensitive Information**
The dataset does not contain any personal or sensitive information.
---
**Considerations for Using the Data**
- **Social Impact of Dataset**
The dataset can contribute to the understanding of Indian Polity and Constitution. It can help in educational, research, and AI applications.
- **Discussion of Biases**
There is no obvious bias in the dataset as it provides factual information related to the Indian Constitution and Polity.
- **Other Known Limitations**
The dataset may not cover all possible questions on Indian Polity. Additionally, all answers are in English, which may limit its use for non-English speakers.
---
**Additional Information**
- **Dataset Curators**
The dataset has been curated by an OpenAI language model.
- **Licensing Information**
The dataset follows OpenAI's standard data use policy.
- **Citation Information**
Not applicable as this is an artificial dataset.
- **Contributions**
The dataset was generated by the ChatGPT model trained by OpenAI. |
DIAS123/JUNIOR | ---
license: openrail
---
|
MohammedNasri/prepared_train | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 39269002992
num_examples: 40880
download_size: 6186932503
dataset_size: 39269002992
---
# Dataset Card for "prepared_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gagan3012/reranking-arabic-medicalqa | ---
dataset_info:
features:
- name: positive
sequence: string
- name: negative
sequence: string
- name: query
dtype: string
splits:
- name: test
num_bytes: 56187343
num_examples: 17552
download_size: 9654152
dataset_size: 56187343
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
polinaeterna/hh-rlhf | ---
configs:
- config_name: all
default: true
data_files:
- split: train
path:
- "harmless-base/train*"
- "helpful-*/train*"
- split: test
path:
- "harmless-base/test*"
- "helpful-*/test*"
- config_name: harmless-base
data_dir: harmless-base
- config_name: helpful-base
data_dir: helpful-base
- config_name: helpful-online
data_dir: helpful-online
- config_name: helpful-rejection-sampled
data_dir: helpful-rejection-sampled
- config_name: red-team-attempts
data_dir: red-team-attempts
license: mit
tags:
- human-feedback
duplicated_from: Anthropic/hh-rlhf
---
# Dataset Card for HH-RLHF
## Dataset Summary
This repository provides access to two different kinds of data:
1. Human preference data about helpfulness and harmlessness from [Training a Helpful and Harmless Assistant with Reinforcement Learning from Human Feedback](https://arxiv.org/abs/2204.05862). These data are meant to train preference (or reward) models for subsequent RLHF training. These data are *not* meant for supervised training of dialogue agents. Training dialogue agents on these data is likely to lead to harmful models and this should be avoided.
2. Human-generated and annotated red teaming dialogues from [Red Teaming Language Models to Reduce Harms: Methods, Scaling Behaviors, and Lessons Learned](https://www.anthropic.com/red_teaming.pdf). These data are meant to understand how crowdworkers red team models and what types of red team attacks are successful or not. The data are *not* meant for fine-tuning or preference modeling (use the data above for preference modeling). These data are entire transcripts of conversations that are derived from the harmlessness preference modeling data described above, where only the chosen response is incorporated into the overall transcript. Furthermore, the transcripts are annotated with human and automated measurements of how harmful the overall dialogues are.
**Disclaimer**: The data (especially the harmlessness preference data and the red team data) contain content that may be offensive or upsetting. Topics include, but are not limited to, discriminatory language and discussions of abuse, violence, self-harm, exploitation, and other potentially upsetting subject matter. Please only engage with the data in accordance with your own personal risk tolerance. The data are intended for research purposes, especially research that can make models *less* harmful. The views expressed in the data do not reflect the views of Anthropic or any of its employees. As mentioned above, these data are *not* intended for training dialogue agents as this will likely lead to harmful model behavior.
Each of these datasets are described further below.
## Human preference data about helpfulness and harmlessness (PM Data)
The data are described in the paper: [Training a Helpful and Harmless Assistant with Reinforcement Learning from Human Feedback](https://arxiv.org/abs/2204.05862). If you find the data useful, please cite the paper. The data format is very simple -- each line of the jsonl files contains a pair of texts, one "chosen" and one "rejected".
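Each jsonl record can therefore be parsed as a plain dict with two keys. A minimal sketch — the example dialogue text is invented; only the "chosen"/"rejected" keys come from the format described above:

```python
import json

# One line of the preference-pair jsonl (invented example text)
line = '{"chosen": "\\n\\nHuman: Hi\\n\\nAssistant: Hello! How can I help?", "rejected": "\\n\\nHuman: Hi\\n\\nAssistant: Go away."}'
pair = json.loads(line)
assert set(pair) == {"chosen", "rejected"}
```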
For **helpfulness**, the data are grouped into train/test splits in three tranches: from our base models (context-distilled 52B language models), via rejection sampling (mostly with best-of-16 sampling) against an early preference model, and a dataset sampled during our iterated "online" process.
For **harmlessness**, the data are only collected for our base models, but otherwise formatted in the same way.
Details about the data collection process and crowdworker population can be found in the paper, specifically in section 2 and appendix D.
## Red teaming data (not PM Data)
The data are described in the paper: [Red Teaming Language Models to Reduce Harms: Methods, Scaling Behaviors, and Lessons Learned](https://www.anthropic.com/red_teaming.pdf). If you find the data useful, please cite the paper. At a high level these data can be examined to understand what successful red team attempts look like.
Details about the data and data collection procedures can be found in the Datasheet in the appendix of the paper.
Each line of the jsonl file contains a dictionary with the following fields:
- `transcript` a text transcript of a conversation between a human adversary (red team member) and an AI assistant
- `min_harmlessness_score_transcript` a real value score of the harmlessness of the AI assistant (lower is more harmful) as obtained from a preference model
- `num_params` number of parameters in the language model powering the AI assistant
- `model_type` type of model powering the AI assistant
- `rating` the red team member's rating of how successful they were at breaking the AI assistant (Likert scale, higher is more successful)
- `task_description` a short text description written by the red team member about how they tried to red team the AI assistant
- `task_description_harmlessness_score` a real value score of the harmlessness of the task description (lower is more harmful) as obtained from a preference model
- `red_team_member_id` an arbitrary identifier of the red team member. one red team member can generate multiple red team attacks
- `is_upworker` a binary indicator that is true if the red team member was from the crowd platform Upwork or false if they were from MTurk
- `tags` a list of up to 6 tags per transcript. tags are short descriptions of the red team attempts generated by crowdworkers who reviewed red team data post-hoc. tags were only provided for a random sample of 1000 red team attempts for two of four model types.
## Usage
Each of the above datasets is located in a separate sub-directory. To load an individual subset, use the `data_dir` argument of the `load_dataset()` function as follows:
```python
from datasets import load_dataset
# Load all helpfulness/harmless subsets (share the same schema)
dataset = load_dataset("Anthropic/hh-rlhf")
# Load one of the harmless subsets
dataset = load_dataset("Anthropic/hh-rlhf", data_dir="harmless-base")
# Load the red teaming subset
dataset = load_dataset("Anthropic/hh-rlhf", data_dir="red-team-attempts")
```
## Contact
The original authors host this dataset on GitHub here: https://github.com/anthropics/hh-rlhf
You can submit inquiries to: redteam@anthropic.com |
CyberHarem/fiora_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of fiora (Fire Emblem)
This is the dataset of fiora (Fire Emblem), containing 98 images and their tags.
The core tags of this character are `long_hair, blue_eyes, breasts, aqua_hair, blue_hair, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 98 | 98.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fiora_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 98 | 63.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fiora_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 186 | 117.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fiora_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 98 | 89.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fiora_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 186 | 154.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fiora_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fiora_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, solo, breastplate, fingerless_gloves, thighhighs, belt, smile, looking_at_viewer, spear, thigh_boots, pegasus_knight_uniform_(fire_emblem), holding |
| 1 | 23 |  |  |  |  |  | 1girl, hair_flower, bikini, solo, cleavage, navel, smile, looking_at_viewer, open_mouth, umbrella, bare_shoulders, holding, blue_sky, cloud, day, medium_breasts, simple_background, blush, outdoors |
| 2 | 17 |  |  |  |  |  | hetero, blush, nipples, open_mouth, solo_focus, 1girl, penis, vaginal, nude, 1boy, navel, sweat, girl_on_top, mosaic_censoring, thighhighs, cum_in_pussy, fingerless_gloves, headband, straddling, female_pubic_hair, group_sex, medium_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | breastplate | fingerless_gloves | thighhighs | belt | smile | looking_at_viewer | spear | thigh_boots | pegasus_knight_uniform_(fire_emblem) | holding | hair_flower | bikini | cleavage | navel | open_mouth | umbrella | bare_shoulders | blue_sky | cloud | day | medium_breasts | simple_background | blush | outdoors | hetero | nipples | solo_focus | penis | vaginal | nude | 1boy | sweat | girl_on_top | mosaic_censoring | cum_in_pussy | headband | straddling | female_pubic_hair | group_sex |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------|:--------------------|:-------------|:-------|:--------|:--------------------|:--------|:--------------|:---------------------------------------|:----------|:--------------|:---------|:-----------|:--------|:-------------|:-----------|:-----------------|:-----------|:--------|:------|:-----------------|:--------------------|:--------|:-----------|:---------|:----------|:-------------|:--------|:----------|:-------|:-------|:--------|:--------------|:-------------------|:---------------|:-----------|:-------------|:--------------------|:------------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 23 |  |  |  |  |  | X | X | | | | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 2 | 17 |  |  |  |  |  | X | | | X | X | | | | | | | | | | | X | X | | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Sofoklis/gc_random | ---
dataset_info:
features:
- name: number
dtype: int64
- name: name
dtype: string
- name: sequence
dtype: string
- name: spaced_sequence
dtype: string
- name: array
sequence:
sequence: float64
- name: image
dtype: image
splits:
- name: train
num_bytes: 43557152.4
num_examples: 90
- name: test
num_bytes: 4839683.6
num_examples: 10
- name: valid
num_bytes: 8711430.48
num_examples: 18
download_size: 11731693
dataset_size: 57108266.480000004
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
Jhonjhorg/udacity-bangladesh | ---
license: afl-3.0
language:
- en
pretty_name: udacity-bangladesh
size_categories:
- 100M<n<1B
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
brontomind/ethicaltextclassifier | ---
license: apache-2.0
tags:
- text classification
pretty_name: Ethical Text Classifier
size_categories:
- 1K<n<10K
--- |
Venafi/Machine-Identity-Spectra | ---
license: apache-2.0
task_categories:
- feature-extraction
language:
- en
tags:
- certificates
- machine identity
- security
size_categories:
- 10M<n<100M
pretty_name: Machine Identity Spectra Dataset
configs:
- config_name: sample_data
data_files: Data/CertificateFeatures-sample.parquet
---
# Machine Identity Spectra Dataset
<img src="https://huggingface.co/datasets/Venafi/Machine-Identity-Spectra/resolve/main/VExperimentalSpectra.svg" alt="Spectra Dataset" width="250">
## Summary
Venafi is excited to release the Machine Identity Spectra dataset.
This collection contains features extracted from more than 19 million certificates discovered over HTTPS (port 443) on the
public internet between July 20 and July 26, 2023.
The features combine X.509 certificate attributes, RFC 5280 compliance checks,
and other properties intended for clustering, feature analysis, and as a base for supervised learning tasks (labels not included).
Some rows contain NaN values and may require additional pre-processing for certain tasks.
This project is part of Venafi Athena. Venafi is committed to enabling the data science community to increase the adoption of machine learning techniques
to identify machine identity threats and solutions.
Phillip Maraveyias at Venafi is the lead researcher for this dataset.
## Data Structure
The extracted features are contained in the Data folder as certificateFeatures.csv.gz. The unarchived data size is
approximately 10GB and contains 98 extracted features for approximately 19m certificates. A description of the features
and expected data types is contained in the base folder as features.csv.
The Data folder also contains a 500k row sample of the data in parquet format. This is displayed in the Data Viewer
for easy visual inspection of the dataset.
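Since the unarchived CSV is roughly 10GB, reading it in bounded-memory chunks is a practical approach. A minimal sketch of that pattern with pandas is shown below, using a tiny synthetic in-memory stand-in rather than the real `certificateFeatures.csv.gz`:

```python
# Sketch: read a gzipped CSV in chunks so memory stays bounded.
# The buffer below is a tiny synthetic stand-in, not the actual dataset file.
import gzip
import io

import pandas as pd

# Build a small gzipped CSV in memory as a stand-in for certificateFeatures.csv.gz
buf = io.BytesIO()
with gzip.open(buf, "wt") as f:
    f.write("feature_a,feature_b\n" + "\n".join(f"{i},{i % 2}" for i in range(10)))
buf.seek(0)

# Iterate over the file in fixed-size chunks instead of loading it all at once
n_rows = 0
for chunk in pd.read_csv(buf, compression="gzip", chunksize=4):
    n_rows += len(chunk)  # per-chunk processing would go here
print(n_rows)
```

For the real file, the same loop applies with the path to the unarchived `certificateFeatures.csv.gz` in place of the in-memory buffer.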
## Clustering and PCA Example
To demonstrate a potential use of the data, KMeans clustering and Principal Component Analysis (PCA) were
conducted on the binary features in the dataset: 10 clusters were generated, and the top 3 principal components were preserved.
Since we are primarily interested in visualizing the data and understanding how it may be used, the choice of 10 clusters is
mostly illustrative.
The top three PCA components accounted for approximately 61%, 10%, and 6% of the total explained variance
(for a total of 77% of the overall data variance). Plots of the first 2 components in 2D space and top 3 components in
3D space grouped into the 10 clusters are shown below.
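A minimal sketch of this workflow (clustering binary features with KMeans, then projecting to the top 3 PCA components) might look like the following; note the feature matrix here is random stand-in data, not the Spectra dataset:

```python
# Sketch: KMeans clustering + 3-component PCA on synthetic binary features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(1000, 40)).astype(float)  # stand-in binary features

# Group the rows into 10 clusters (illustrative, as in the card)
kmeans = KMeans(n_clusters=10, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)

# Keep the top 3 principal components for 2D/3D visualization
pca = PCA(n_components=3)
X3 = pca.fit_transform(X)

print(X3.shape, pca.explained_variance_ratio_)
```

The `X3` columns can then be scatter-plotted (colored by `labels`) to produce views like the 2D and 3D cluster figures below.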
### Clusters in 2 Dimensions

### Clusters in 3 Dimensions

## Contact
Please contact athena-community@venafi.com if you have any questions about this dataset.
## References and Acknowledgement
The following papers provided inspiration for this project:
- Li, J.; Zhang, Z.; Guo, C. Machine Learning-Based Malicious X.509 Certificates’ Detection. Appl. Sci. 2021, 11, 2164. https://doi.org/ 10.3390/app11052164
- Liu, J.; Luktarhan, N.; Chang, Y.; Yu, W. Malcertificate: Research and Implementation of a Malicious Certificate Detection Algorithm Based on GCN. Appl. Sci. 2022,12,4440. https://doi.org/ 10.3390/app12094440 |
namespace-Pt/msmarco-corpus | ---
dataset_info:
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 3243246889
num_examples: 8841823
download_size: 1720789558
dataset_size: 3243246889
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "msmarco-corpus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FanChen0116/19100_chat_8x_slot | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-time
'2': B-date
'3': B-last_name
'4': B-people
'5': I-date
'6': I-people
'7': I-last_name
'8': I-first_name
'9': B-first_name
'10': B-time
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 78287
num_examples: 512
- name: validation
num_bytes: 4887
num_examples: 32
- name: test
num_bytes: 570513
num_examples: 3731
download_size: 0
dataset_size: 653687
---
# Dataset Card for "19100_chat_8x_slot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
firopyomyo/gggg | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': conditioning_images
'1': images
splits:
- name: train
num_bytes: 9235.0
num_examples: 2
download_size: 6697
dataset_size: 9235.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AIARTCHAN/lora-Hyundai_Equus_1999 | ---
license: creativeml-openrail-m
tags:
- lora
- aiartchan
- stable-diffusion
pretty_name: Hyundai Equus 1999
---
# Hyundai Equus 1999
First-generation Hyundai Equus LoRA.
Recommended weight: 0.8 – 1.
[다운로드 (151MB)](https://huggingface.co/datasets/AIARTCHAN/lora-Hyundai_Equus_1999/resolve/main/Equus_1-000006.safetensors)
|
heekhero/DTL_datasets | ---
license: mit
---
|
autoevaluate/autoeval-eval-futin__feed-sen_en_-1de085-2240171542 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-1b7
metrics: []
dataset_name: futin/feed
dataset_config: sen_en_
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-1b7
* Dataset: futin/feed
* Config: sen_en_
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
musabg/wizard_vicuna_70k_unfiltered_de | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 159146233
num_examples: 34598
download_size: 79402352
dataset_size: 159146233
---
# Dataset Card for "wizard_vicuna_70k_unfiltered_de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amazingvince/full-dpo | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 603744333.9991473
num_examples: 184597
- name: test
num_bytes: 6099683.000852721
num_examples: 1865
download_size: 344241874
dataset_size: 609844017.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_11_10000000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 193615
num_examples: 6699
download_size: 124007
dataset_size: 193615
---
# Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_11_10000000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HamdanXI/daily_dialog_gloss_FINAL | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: gloss
dtype: string
splits:
- name: train
num_bytes: 11195623
num_examples: 77350
download_size: 7028108
dataset_size: 11195623
---
# Dataset Card for "daily_dialog_gloss_FINAL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/kids_fashion_prompts_SDXL | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 391358850
num_examples: 1000000
download_size: 51725108
dataset_size: 391358850
---
# Dataset Card for "kids_fashion_prompts_SDXL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yancao6/azureml_data_test_gitdatastore | ---
license: mit
---
|
guillezala/dataset1 | ---
license: mit
---
|
roskyluo/stanford_cars_blip | ---
license: apache-2.0
---
|
seedboxai/german_to_english_translations_v1 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: tokens
dtype: string
- name: range
dtype: string
- name: text
dtype: string
- name: original
dtype: string
- name: translation
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 8534323299
num_examples: 1347167
- name: test
num_bytes: 947334755
num_examples: 149686
download_size: 5266655381
dataset_size: 9481658054
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Dmitriy007/Socrat | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 37517232.0
num_examples: 11994
- name: val
num_bytes: 6621976.0
num_examples: 2117
download_size: 16725921
dataset_size: 44139208.0
---
# Dataset Card for "Socrat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_cerebras__Cerebras-GPT-1.3B | ---
pretty_name: Evaluation run of cerebras/Cerebras-GPT-1.3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cerebras/Cerebras-GPT-1.3B](https://huggingface.co/cerebras/Cerebras-GPT-1.3B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cerebras__Cerebras-GPT-1.3B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-16T21:53:03.543660](https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-1.3B/blob/main/results_2023-10-16T21-53-03.543660.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00041946308724832214,\n\
\ \"em_stderr\": 0.00020969854707829098,\n \"f1\": 0.03696203859060411,\n\
\ \"f1_stderr\": 0.0010536462556224307,\n \"acc\": 0.26830376029292,\n\
\ \"acc_stderr\": 0.007665737673204982\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.00041946308724832214,\n \"em_stderr\": 0.00020969854707829098,\n\
\ \"f1\": 0.03696203859060411,\n \"f1_stderr\": 0.0010536462556224307\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148673984\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5343330702446725,\n \"acc_stderr\": 0.014019317531542565\n\
\ }\n}\n```"
repo_url: https://huggingface.co/cerebras/Cerebras-GPT-1.3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|arc:challenge|25_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_16T21_53_03.543660
path:
- '**/details_harness|drop|3_2023-10-16T21-53-03.543660.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-16T21-53-03.543660.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_16T21_53_03.543660
path:
- '**/details_harness|gsm8k|5_2023-10-16T21-53-03.543660.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-16T21-53-03.543660.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hellaswag|10_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:08:05.365000.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T11:08:05.365000.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T11:08:05.365000.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_16T21_53_03.543660
path:
- '**/details_harness|winogrande|5_2023-10-16T21-53-03.543660.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-16T21-53-03.543660.parquet'
- config_name: results
data_files:
- split: 2023_07_18T11_08_05.365000
path:
- results_2023-07-18T11:08:05.365000.parquet
- split: 2023_10_16T21_53_03.543660
path:
- results_2023-10-16T21-53-03.543660.parquet
- split: latest
path:
- results_2023-10-16T21-53-03.543660.parquet
---
# Dataset Card for Evaluation run of cerebras/Cerebras-GPT-1.3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/cerebras/Cerebras-GPT-1.3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [cerebras/Cerebras-GPT-1.3B](https://huggingface.co/cerebras/Cerebras-GPT-1.3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cerebras__Cerebras-GPT-1.3B",
"harness_winogrande_5",
	split="latest")
```
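The per-task configuration names follow a regular pattern, `harness_<suite>_<task>_<n_shots>` (compare the config list above, e.g. `harness_hendrycksTest_college_biology_5`). A small helper can build these names before loading; this is a sketch inferred from the config list, not an official API:

```python
def harness_config_name(suite: str, task: str, n_shots: int) -> str:
    """Build a details-config name such as 'harness_hendrycksTest_college_biology_5'."""
    return f"harness_{suite}_{task}_{n_shots}"

# Example: the five-shot MMLU college biology subtask
config = harness_config_name("hendrycksTest", "college_biology", 5)
print(config)  # -> harness_hendrycksTest_college_biology_5
```

The resulting string can then be passed as the second argument to `load_dataset` as in the example above.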
## Latest results
These are the [latest results from run 2023-10-16T21:53:03.543660](https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-1.3B/blob/main/results_2023-10-16T21-53-03.543660.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.00041946308724832214,
"em_stderr": 0.00020969854707829098,
"f1": 0.03696203859060411,
"f1_stderr": 0.0010536462556224307,
"acc": 0.26830376029292,
"acc_stderr": 0.007665737673204982
},
"harness|drop|3": {
"em": 0.00041946308724832214,
"em_stderr": 0.00020969854707829098,
"f1": 0.03696203859060411,
"f1_stderr": 0.0010536462556224307
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148673984
},
"harness|winogrande|5": {
"acc": 0.5343330702446725,
"acc_stderr": 0.014019317531542565
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
juancopi81/mls | ---
task_categories:
- automatic-speech-recognition
dataset_info:
features:
- name: CHANNEL_NAME
dtype: string
- name: URL
dtype: string
- name: TITLE
dtype: string
- name: DESCRIPTION
dtype: string
- name: TRANSCRIPTION
dtype: string
- name: SEGMENTS
dtype: string
splits:
- name: train
num_bytes: 2690661
num_examples: 142
download_size: 1117834
dataset_size: 2690661
tags:
- whisper
- whispering
- medium
---
# Dataset Card for "mls"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Atipico1/mrqa_v2_entity | ---
dataset_info:
features:
- name: subset
dtype: string
- name: qid
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: masked_query
dtype: string
- name: context
dtype: string
- name: answer_sent
dtype: string
- name: answer_in_context
sequence: string
- name: query_embedding
sequence: float32
- name: entity
dtype: string
- name: similar_entity
dtype: string
- name: similar_entity_score
dtype: float32
- name: random_entity
dtype: string
- name: random_entity_score
dtype: float64
splits:
- name: train
num_bytes: 430961474
num_examples: 105316
download_size: 452634359
dataset_size: 430961474
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|