| datasetId | card |
|---|---|
A-Bar/vi-ar_non_top_cs_dev | ---
dataset_info:
features:
- name: query
dtype: string
- name: passage
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 44795482
num_examples: 100000
download_size: 17805008
dataset_size: 44795482
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "vi-ar_non_top_cs_dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
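The `configs` block above maps the `train` split to files matching the glob pattern `data/train-*`. As a small illustration of how such a pattern selects files (the file names below are hypothetical, not taken from the repository):

```python
from fnmatch import fnmatch

# hypothetical repository file listing
files = [
    "data/train-00000-of-00002.parquet",
    "data/train-00001-of-00002.parquet",
    "README.md",
]

# keep only the files matched by the split's glob pattern
train_files = [f for f in files if fnmatch(f, "data/train-*")]
print(train_files)
```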
silk-road/IMDB-Face-Recognition | ---
task_categories:
- image-classification
dataset_info:
config_name: imdb-face
features:
- name: celeb_name
dtype: string
- name: image_name
dtype: string
- name: image
dtype: image
- name: average_feature
dtype: sequence
splits:
- name: train
num_bytes: 112545106.0
num_examples: 30000
configs:
- config_name: imdb-face
data_files:
- split: train
path: celeb_average_*
license: apache-2.0
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_rte_adj_postfix | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 782841
num_examples: 2110
- name: train
num_bytes: 679951
num_examples: 1764
download_size: 950669
dataset_size: 1462792
---
# Dataset Card for "MULTI_VALUE_rte_adj_postfix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/harusame_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of harusame/春雨 (Kantai Collection)
This is the dataset of harusame/春雨 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `pink_hair, side_ponytail, long_hair, blue_hair, gradient_hair, multicolored_hair, hat, red_eyes, hair_ornament, beret, ribbon, pink_eyes, hair_flower, hair_ribbon, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 548.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/harusame_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 338.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/harusame_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1210 | 733.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/harusame_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 490.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/harusame_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1210 | 991.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/harusame_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/harusame_kantaicollection',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 41 |  |  |  |  |  | enmaided, maid_headdress, red_neckerchief, 1girl, solo, black_dress, looking_at_viewer, white_apron, long_sleeves, adapted_costume, blush, hair_flaps, frilled_apron, maid_apron, flower, smile, open_mouth, white_background, heart, simple_background |
| 1 | 7 |  |  |  |  |  | 1girl, enmaided, looking_at_viewer, maid_headdress, solo, valentine, blush, heart-shaped_chocolate, maid_apron, long_sleeves, black_pantyhose, frills, smile |
| 2 | 7 |  |  |  |  |  | 1girl, black_serafuku, black_skirt, pleated_skirt, red_neckerchief, solo, cowboy_shot, looking_at_viewer, simple_background, white_background, white_headwear, white_sailor_collar, flower, black_gloves, black_shirt, heart, one-hour_drawing_challenge, open_mouth, smile, twitter_username |
| 3 | 10 |  |  |  |  |  | 1girl, black_gloves, black_serafuku, black_skirt, blush, pleated_skirt, solo, looking_at_viewer, red_neckerchief, simple_background, white_background, flower, sailor_collar, short_sleeves, open_mouth, smile |
| 4 | 18 |  |  |  |  |  | 1girl, looking_at_viewer, serafuku, solo, black_gloves, flower, pleated_skirt, open_mouth |
| 5 | 13 |  |  |  |  |  | 1girl, black_serafuku, upper_body, red_neckerchief, simple_background, solo, white_background, white_headwear, looking_at_viewer, blush, flower, short_sleeves, black_gloves, open_mouth, smile, white_sailor_collar |
| 6 | 5 |  |  |  |  |  | 1girl, black_serafuku, black_skirt, black_socks, brown_footwear, loafers, pleated_skirt, red_neckerchief, solo, white_headwear, kneehighs, white_background, black_gloves, flower, full_body, hairclip, looking_at_viewer, sailor_collar, simple_background, smile, standing, blush, machinery |
| 7 | 17 |  |  |  |  |  | 1girl, solo, smile, white_headwear, short_sleeves, looking_at_viewer, flower, pink_dress, black_belt, white_background, blush, open_mouth, simple_background, houndstooth, black_bowtie, black_ribbon, buckle, hair_flaps, official_alternate_costume, puffy_sleeves, white_shirt |
| 8 | 21 |  |  |  |  |  | 1girl, looking_at_viewer, solo, flower, black_bikini, medium_breasts, navel, adapted_costume, cleavage, blush, simple_background, sky, white_background, cloud, collarbone, sailor_bikini, white_headwear |
| 9 | 9 |  |  |  |  |  | 1girl, obi, solo, looking_at_viewer, alternate_costume, blush, floral_print, flower, smile, holding, open_mouth, long_sleeves, night, pink_kimono, wide_sleeves, yukata |
| 10 | 6 |  |  |  |  |  | cowboy_shot, highleg_swimsuit, white_one-piece_swimsuit, 1girl, competition_swimsuit, covered_navel, small_breasts, white_headwear, braid, hair_flaps, hairclip, solo, white_thighhighs, looking_at_viewer, nose_blush |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | enmaided | maid_headdress | red_neckerchief | 1girl | solo | black_dress | looking_at_viewer | white_apron | long_sleeves | adapted_costume | blush | hair_flaps | frilled_apron | maid_apron | flower | smile | open_mouth | white_background | heart | simple_background | valentine | heart-shaped_chocolate | black_pantyhose | frills | black_serafuku | black_skirt | pleated_skirt | cowboy_shot | white_headwear | white_sailor_collar | black_gloves | black_shirt | one-hour_drawing_challenge | twitter_username | sailor_collar | short_sleeves | serafuku | upper_body | black_socks | brown_footwear | loafers | kneehighs | full_body | hairclip | standing | machinery | pink_dress | black_belt | houndstooth | black_bowtie | black_ribbon | buckle | official_alternate_costume | puffy_sleeves | white_shirt | black_bikini | medium_breasts | navel | cleavage | sky | cloud | collarbone | sailor_bikini | obi | alternate_costume | floral_print | holding | night | pink_kimono | wide_sleeves | yukata | highleg_swimsuit | white_one-piece_swimsuit | competition_swimsuit | covered_navel | small_breasts | braid | white_thighhighs | nose_blush |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------|:-----------------|:------------------|:--------|:-------|:--------------|:--------------------|:--------------|:---------------|:------------------|:--------|:-------------|:----------------|:-------------|:---------|:--------|:-------------|:-------------------|:--------|:--------------------|:------------|:-------------------------|:------------------|:---------|:-----------------|:--------------|:----------------|:--------------|:-----------------|:----------------------|:---------------|:--------------|:-----------------------------|:-------------------|:----------------|:----------------|:-----------|:-------------|:--------------|:-----------------|:----------|:------------|:------------|:-----------|:-----------|:------------|:-------------|:-------------|:--------------|:---------------|:---------------|:---------|:-----------------------------|:----------------|:--------------|:---------------|:-----------------|:--------|:-----------|:------|:--------|:-------------|:----------------|:------|:--------------------|:---------------|:----------|:--------|:--------------|:---------------|:---------|:-------------------|:---------------------------|:-----------------------|:----------------|:----------------|:--------|:-------------------|:-------------|
| 0 | 41 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | X | X | | X | | X | | X | | | X | | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | | | X | X | X | | X | | | | | | | | X | X | X | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | | | X | X | X | | X | | | | X | | | | X | X | X | X | | X | | | | | X | X | X | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 18 |  |  |  |  |  | | | | X | X | | X | | | | | | | | X | | X | | | | | | | | | | X | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 13 |  |  |  |  |  | | | X | X | X | | X | | | | X | | | | X | X | X | X | | X | | | | | X | | | | X | X | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | | | X | X | X | | X | | | | X | | | | X | X | | X | | X | | | | | X | X | X | | X | | X | | | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 17 |  |  |  |  |  | | | | X | X | | X | | | | X | X | | | X | X | X | X | | X | | | | | | | | | X | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 21 |  |  |  |  |  | | | | X | X | | X | | | X | X | | | | X | | | X | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 9 | 9 |  |  |  |  |  | | | | X | X | | X | | X | | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | |
| 10 | 6 |  |  |  |  |  | | | | X | X | | X | | | | | X | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
paralleldynamix/autotrain-data-face-swap-video-generation | ---
license: bsd
task_categories:
- feature-extraction
- conversational
language:
- en
pretty_name: face swap image to video
size_categories:
- 1K<n<10K
--- |
fast-flash/fast-flash-hackernews-posts | ---
license: apache-2.0
tags:
- hackernews
- text
- social
- nlp
size_categories:
- 10M<n<100M
language:
- en
pretty_name: Fast Flash | HackerNews Posts
task_categories:
- text-classification
- text-generation
- conversational
---
# Fast Flash | HackerNews Posts Dataset
### Exploratory Analysis
Take a look at some fascinating findings from this dataset [on our website](http://wearefastflash.com/blog/hackernews).
### Dataset Summary
We release a dataset of all HackerNews posts.
The dataset includes 35,316,999 posts and was collected in March 2023.
You can also find a dataset of all users [right here](https://huggingface.co/datasets/fast-flash/fast-flash-hackernews-users).
### Dataset Structure
The post objects in this dataset are structured according to HackerNews' [API specification](https://github.com/HackerNews/API).
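As a quick orientation, a sketch of parsing one such post object with the standard library (the field names follow the HackerNews API specification; the concrete values below are illustrative, not taken from this dataset):

```python
import json

# A sample "story" item shaped per the HackerNews API specification;
# the values are made up for illustration.
raw = '''{
  "id": 8863,
  "type": "story",
  "by": "dhouston",
  "time": 1175714200,
  "title": "My YC app: Dropbox - Throw away your USB drive",
  "url": "http://www.getdropbox.com/u/2/screencast.html",
  "score": 111,
  "descendants": 71,
  "kids": [8952, 9224]
}'''

post = json.loads(raw)
# "kids" (child comment ids) is optional on items, hence the .get() default
print(post["type"], post["by"], len(post.get("kids", [])))
```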
## About the Author
[Fast Flash](https://wearefastflash.com) is a multidisciplinary creative studio that specializes in data-driven development, product design, branding, and tech.
Need help with design, coding, machine learning, pitch decks, data, or analytics?
Drop us a line at [hi@wearefastflash.com](mailto:hi@wearefastflash.com). |
ruanchaves/assin2_por_Latn_to_eng_Latn | ---
dataset_info:
features:
- name: sentence_pair_id
dtype: int64
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: relatedness_score
dtype: float32
- name: entailment_judgment
dtype:
class_label:
names:
'0': NONE
'1': ENTAILMENT
- name: __language__
dtype: string
splits:
- name: train
num_bytes: 802897
num_examples: 6500
- name: test
num_bytes: 313661
num_examples: 2448
- name: validation
num_bytes: 62531
num_examples: 500
download_size: 0
dataset_size: 1179089
---
# Dataset Card for "assin2_por_Latn_to_eng_Latn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
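The `entailment_judgment` feature is a class label stored as an integer. A minimal decoding sketch using the names declared in the card's `class_label` block (the example row is hypothetical):

```python
# Label names as declared in the card's class_label block
ENTAILMENT_NAMES = {0: "NONE", 1: "ENTAILMENT"}

# hypothetical example row
row = {
    "sentence_pair_id": 1,
    "premise": "...",
    "hypothesis": "...",
    "entailment_judgment": 1,
}
print(ENTAILMENT_NAMES[row["entailment_judgment"]])  # ENTAILMENT
```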
tyzhu/random_letter_find_passage_train50_eval20_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 10984
num_examples: 120
- name: validation
num_bytes: 2298
num_examples: 20
download_size: 9499
dataset_size: 13282
---
# Dataset Card for "random_letter_find_passage_train50_eval20_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_concedo__Pythia-70M-ChatSalad | ---
pretty_name: Evaluation run of concedo/Pythia-70M-ChatSalad
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [concedo/Pythia-70M-ChatSalad](https://huggingface.co/concedo/Pythia-70M-ChatSalad)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_concedo__Pythia-70M-ChatSalad\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T19:59:13.355253](https://huggingface.co/datasets/open-llm-leaderboard/details_concedo__Pythia-70M-ChatSalad/blob/main/results_2023-09-22T19-59-13.355253.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.00039210421902982634,\n \"f1\": 0.008363045302013424,\n\
\ \"f1_stderr\": 0.0006175853648384896,\n \"acc\": 0.26203630623520124,\n\
\ \"acc_stderr\": 0.0070180948326975644\n },\n \"harness|drop|3\":\
\ {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.00039210421902982634,\n\
\ \"f1\": 0.008363045302013424,\n \"f1_stderr\": 0.0006175853648384896\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5240726124704025,\n\
\ \"acc_stderr\": 0.014036189665395129\n }\n}\n```"
repo_url: https://huggingface.co/concedo/Pythia-70M-ChatSalad
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T19_59_13.355253
path:
- '**/details_harness|drop|3_2023-09-22T19-59-13.355253.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T19-59-13.355253.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T19_59_13.355253
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-59-13.355253.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-59-13.355253.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:36:47.045814.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:36:47.045814.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:36:47.045814.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T19_59_13.355253
path:
- '**/details_harness|winogrande|5_2023-09-22T19-59-13.355253.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T19-59-13.355253.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_36_47.045814
path:
- results_2023-07-19T13:36:47.045814.parquet
- split: 2023_09_22T19_59_13.355253
path:
- results_2023-09-22T19-59-13.355253.parquet
- split: latest
path:
- results_2023-09-22T19-59-13.355253.parquet
---
# Dataset Card for Evaluation run of concedo/Pythia-70M-ChatSalad
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/concedo/Pythia-70M-ChatSalad
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [concedo/Pythia-70M-ChatSalad](https://huggingface.co/concedo/Pythia-70M-ChatSalad) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_concedo__Pythia-70M-ChatSalad",
"harness_winogrande_5",
	split="latest")
```
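Because the run splits are named after timestamps in the fixed-width pattern `YYYY_MM_DDTHH_MM_SS.ffffff`, a plain lexicographic sort orders them chronologically, so the newest run can be found without parsing dates. A small sketch using the two split names from this card:

```python
# Timestamp-named splits from this dataset (fixed-width format, so
# lexicographic order == chronological order).
splits = ["2023_07_19T13_36_47.045814", "2023_09_22T19_59_13.355253"]

latest = max(splits)
print(latest)  # 2023_09_22T19_59_13.355253
```

This is equivalent to selecting the `latest` split, which the card maintains as an alias for the most recent run.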
## Latest results
These are the [latest results from run 2023-09-22T19:59:13.355253](https://huggingface.co/datasets/open-llm-leaderboard/details_concedo__Pythia-70M-ChatSalad/blob/main/results_2023-09-22T19-59-13.355253.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902982634,
"f1": 0.008363045302013424,
"f1_stderr": 0.0006175853648384896,
"acc": 0.26203630623520124,
"acc_stderr": 0.0070180948326975644
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902982634,
"f1": 0.008363045302013424,
"f1_stderr": 0.0006175853648384896
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5240726124704025,
"acc_stderr": 0.014036189665395129
}
}
```
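The results object above is an ordinary nested dict keyed by task name, so individual metrics can be pulled out with plain dictionary access. A minimal sketch (values copied from the JSON above, abridged to two entries):

```python
# Results dict as shown above (abridged).
results = {
    "all": {"acc": 0.26203630623520124, "acc_stderr": 0.0070180948326975644},
    "harness|winogrande|5": {"acc": 0.5240726124704025,
                             "acc_stderr": 0.014036189665395129},
}

# Report accuracy with its standard error for one task.
task = "harness|winogrande|5"
acc, err = results[task]["acc"], results[task]["acc_stderr"]
print(f"{task}: {acc:.4f} +/- {err:.4f}")  # harness|winogrande|5: 0.5241 +/- 0.0140
```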
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Non-Parallel MultiEURLEX (incl. Translations)
annotations_creators:
- found
language_creators:
- found
- machine-generated
language:
- en
- de
- fr
- el
- sk
license:
- cc-by-sa-4.0
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|multi_eurlex
task_categories:
- text-classification
task_ids:
- multi-label-classification
- topic-classification
---
# Dataset Card for "Non-Parallel MultiEURLEX (incl. Translations)"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/nlpaueb/multi-eurlex/tree/realistic-zero-shot
- **Repository:** https://github.com/nlpaueb/multi-eurlex/tree/realistic-zero-shot
- **Paper:** TBA
- **Leaderboard:** N/A
- **Point of Contact:** [Ilias Chalkidis](mailto:ilias.chalkidis@di.ku.dk)
### Dataset Summary
**Documents**
MultiEURLEX of Chalkidis et al. (2021) comprises 65k EU laws in 23 official EU languages. Each EU law has been annotated with EUROVOC concepts (labels) by the Publications Office of the EU. Each EUROVOC label ID is associated with a *label descriptor*, e.g., [60, agri-foodstuffs], [6006, plant product], [1115, fruit]. The descriptors are also available in the 23 languages. Chalkidis et al. (2019) published a monolingual (English) version of this dataset, called EUR-LEX, comprising 57k EU laws with the originally assigned gold labels.
In this new version, dubbed "Non-Parallel MultiEURLEX (incl. Translations)", MultiEURLEX comprises non-parallel documents across 5 languages (English, German, French, Greek, and Slovak), i.e., 11,000 different documents per language, also including machine translations from English into the other 4 languages.
### Supported Tasks and Leaderboards
MultiEURLEX can be used for legal topic classification, a multi-label classification task where legal documents need to be assigned concepts (in our case, from EUROVOC) reflecting their topics. Unlike EUR-LEX, however, MultiEURLEX supports labels from three different granularities (EUROVOC levels). More importantly, apart from monolingual (*one-to-one*) experiments, it can be used to study cross-lingual transfer scenarios, including *one-to-many* (systems trained in one language and used in other languages with no training data), and *many-to-one* or *many-to-many* (systems jointly trained in multiple languages and used in one or more other languages).
The dataset is not yet part of an established benchmark.
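For multi-label topic classification, the integer label lists (e.g. `[1, 13, 47]` in the examples in this card) are typically turned into multi-hot target vectors before training. A minimal sketch; `multi_hot` is our helper name, and the vector length shown is for display only (in practice it would be the number of EUROVOC labels at the chosen level):

```python
def multi_hot(labels, num_labels):
    """Turn a list of EUROVOC label IDs into a multi-hot target vector."""
    vec = [0] * num_labels
    for label_id in labels:
        vec[label_id] = 1
    return vec

# Label IDs as in the dataset examples; 50 is an illustrative vector size.
targets = multi_hot([1, 13, 47], 50)
print(sum(targets))  # 3
```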
### Languages
The EU has 24 official languages. When new members join the EU, the set of official languages usually expands, unless their languages are already included. MultiEURLEX covers 23 languages from seven language families (Germanic, Romance, Slavic, Uralic, Baltic, Semitic, Hellenic). EU laws are published in all official languages except Irish, for resource-related reasons (read more at https://europa.eu/european-union/about-eu/eu-languages_en). This wide coverage makes MultiEURLEX a valuable testbed for cross-lingual transfer. All languages use the Latin script, except for Bulgarian (Cyrillic script) and Greek. Several other languages are also spoken in EU countries. The EU is home to over 60 additional indigenous regional or minority languages, e.g., Basque, Catalan, Frisian, Saami, and Yiddish, among others, spoken by approx. 40 million people, but these additional languages are not official at the EU level, and EU laws are not translated into them.
This version of MultiEURLEX covers 5 EU languages (English, German, French, Greek, and Slovak). It also includes machine-translated versions of the documents using the EasyNMT framework (https://github.com/UKPLab/EasyNMT) utilizing the many-to-many M2M_100_418M model of Fan et al. (2020) for el-to-en and el-to-de pairs and the OPUS-MT (Tiedemann et al., 2020) models for the rest.
## Dataset Structure
### Data Instances
**Multilingual use of the dataset**
When the dataset is used in a multilingual setting, select the 'all_languages' flag:
```python
from datasets import load_dataset
dataset = load_dataset('nlpaueb/multi_eurlex', 'all_languages')
```
```json
{
"celex_id": "31979D0509",
"text": {"en": "COUNCIL DECISION of 24 May 1979 on financial aid from the Community for the eradication of African swine fever in Spain (79/509/EEC)\nTHE COUNCIL OF THE EUROPEAN COMMUNITIES\nHaving regard to the Treaty establishing the European Economic Community, and in particular Article 43 thereof,\nHaving regard to the proposal from the Commission (1),\nHaving regard to the opinion of the European Parliament (2),\nWhereas the Community should take all appropriate measures to protect itself against the appearance of African swine fever on its territory;\nWhereas to this end the Community has undertaken, and continues to undertake, action designed to contain outbreaks of this type of disease far from its frontiers by helping countries affected to reinforce their preventive measures ; whereas for this purpose Community subsidies have already been granted to Spain;\nWhereas these measures have unquestionably made an effective contribution to the protection of Community livestock, especially through the creation and maintenance of a buffer zone north of the river Ebro;\nWhereas, however, in the opinion of the Spanish authorities themselves, the measures so far implemented must be reinforced if the fundamental objective of eradicating the disease from the entire country is to be achieved;\nWhereas the Spanish authorities have asked the Community to contribute to the expenses necessary for the efficient implementation of a total eradication programme;\nWhereas a favourable response should be given to this request by granting aid to Spain, having regard to the undertaking given by that country to protect the Community against African swine fever and to eliminate completely this disease by the end of a five-year eradication plan;\nWhereas this eradication plan must include certain measures which guarantee the effectiveness of the action taken, and it must be possible to adapt these measures to developments in the situation by means of a procedure establishing close 
cooperation between the Member States and the Commission;\nWhereas it is necessary to keep the Member States regularly informed as to the progress of the action undertaken,",
"en2fr": "DU CONSEIL du 24 mai 1979 concernant l'aide financiere de la Communaute e l'eradication de la peste porcine africaine en Espagne (79/509/CEE)\nLE CONSEIL DES COMMUNAUTAS EUROPENNES ...",
"en2de": "...",
"en2el": "...",
"en2sk": "..."
},
"labels": [
1,
13,
47
]
}
```
**Monolingual use of the dataset**
When the dataset is used in a monolingual setting, select the ISO language code for one of the 5 supported languages, or a supported translation pair in the form src2trg, where src and trg are ISO language codes, e.g., en2fr for English translated to French. For example:
```python
from datasets import load_dataset
dataset = load_dataset('nlpaueb/multi_eurlex', 'en2fr')
```
```json
{
"celex_id": "31979D0509",
"text": "DU CONSEIL du 24 mai 1979 concernant l'aide financière de la Communauté à l'éradication de la peste porcine africaine en Espagne (79/509/CEE)\nLE CONSEIL DES COMMUNAUTÉS EUROPÉENNES ...",
"labels": [
1,
13,
47
]
}
```
### Data Fields
**Multilingual use of the dataset**
The following data fields are provided for documents (`train`, `dev`, `test`):
`celex_id`: (**str**) The official ID of the document. The CELEX number is the unique identifier for all publications in both Eur-Lex and CELLAR.\
`text`: (dict[**str**]) A dictionary with the 23 languages as keys and the full content of each document as values.\
`labels`: (**List[int]**) The relevant EUROVOC concepts (labels).
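Since the multilingual `text` field is a dictionary keyed by ISO language code, selecting one language is a plain lookup. A minimal sketch on a record shaped like the example above (the English fallback is an illustrative choice, not part of the dataset):

```python
# Record shaped like the multilingual example above (texts abbreviated)
sample = {
    "celex_id": "31979D0509",
    "text": {
        "en": "COUNCIL DECISION of 24 May 1979 ...",
        "fr": "DU CONSEIL du 24 mai 1979 ...",
    },
    "labels": [1, 13, 47],
}

lang = "fr"
# Fall back to English if the requested translation is absent (illustrative choice)
doc = sample["text"].get(lang) or sample["text"]["en"]
print(doc[:10])  # DU CONSEIL
```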
**Monolingual use of the dataset**
The following data fields are provided for documents (`train`, `dev`, `test`):
`celex_id`: (**str**) The official ID of the document. The CELEX number is the unique identifier for all publications in both Eur-Lex and CELLAR.\
`text`: (**str**) The full content of the document in the selected language (or translation pair).\
`labels`: (**List[int]**) The relevant EUROVOC concepts (labels).
If you want to use the descriptors of the EUROVOC concepts, similar to [Chalkidis et al. (2020)](https://aclanthology.org/2020.emnlp-main.607/), please download the relevant JSON file [here](https://raw.githubusercontent.com/nlpaueb/multi-eurlex/master/data/eurovoc_descriptors.json).
Then you may load it and use it:
```python
import json
from datasets import load_dataset
# Load the English part of the dataset
dataset = load_dataset('nlpaueb/multi_eurlex', 'en', split='train')
# Load (label_id, descriptor) mapping
with open('./eurovoc_descriptors.json') as jsonl_file:
    eurovoc_concepts = json.load(jsonl_file)
# Get feature map info
classlabel = dataset.features["labels"].feature
# Retrieve IDs and descriptors from dataset
for sample in dataset:
    print(f'DOCUMENT: {sample["celex_id"]}')
    # DOCUMENT: 32006D0213
    for label_id in sample['labels']:
        print(f'LABEL: id:{label_id}, eurovoc_id: {classlabel.int2str(label_id)}, '
              f'eurovoc_desc: {eurovoc_concepts[classlabel.int2str(label_id)]}')
        # LABEL: id: 1, eurovoc_id: '100160', eurovoc_desc: 'industry'
```
### Data Splits
<table>
<tr><td> Language </td> <td> ISO code </td> <td> Member Countries where official </td> <td> EU Speakers [1] </td> <td> Number of Documents [2] </td> </tr>
<tr><td> English </td> <td> <b>en</b> </td> <td> United Kingdom (1973-2020), Ireland (1973), Malta (2004) </td> <td> 13/51% </td> <td> 11,000 / 1,000 / 5,000 </td> </tr>
<tr><td> German </td> <td> <b>de</b> </td> <td> Germany (1958), Belgium (1958), Luxembourg (1958) </td> <td> 16/32% </td> <td> 11,000 / 1,000 / 5,000 </td> </tr>
<tr><td> French </td> <td> <b>fr</b> </td> <td> France (1958), Belgium(1958), Luxembourg (1958) </td> <td> 12/26% </td> <td> 11,000 / 1,000 / 5,000 </td> </tr>
<tr><td> Greek </td> <td> <b>el</b> </td> <td> Greece (1981), Cyprus (2008) </td> <td> 3/4% </td> <td> 11,000 / 1,000 / 5,000 </td> </tr>
<tr><td> Slovak </td> <td> <b>sk</b> </td> <td> Slovakia (2004) </td> <td> 1/1% </td> <td> 11,000 / 1,000 / 5,000 </td> </tr>
</table>
[1] Native and Total EU speakers percentage (%) \
[2] Training / Development / Test Splits
## Dataset Creation
### Curation Rationale
The original dataset was curated by Chalkidis et al. (2021).\
The new version of the dataset was curated by Xenouleas et al. (2022).\
The documents have been annotated by the Publications Office of EU (https://publications.europa.eu/en).
### Source Data
#### Initial Data Collection and Normalization
The original data are available at the EUR-LEX portal (https://eur-lex.europa.eu) in unprocessed formats (HTML, XML, RDF). The documents were downloaded from the EUR-LEX portal in HTML. The relevant EUROVOC concepts were downloaded from the SPARQL endpoint of the Publications Office of EU (http://publications.europa.eu/webapi/rdf/sparql).
Chalkidis et al. (2021) stripped HTML mark-up to provide the documents in plain text format and inferred the labels for EUROVOC levels 1--3, by backtracking the EUROVOC hierarchy branches, from the originally assigned labels to their ancestors in levels 1--3, respectively.
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
All the documents of the dataset have been annotated by the Publications Office of EU (https://publications.europa.eu/en) with multiple concepts from EUROVOC (http://eurovoc.europa.eu/). EUROVOC has eight levels of concepts. Each document is assigned one or more concepts (labels). If a document is assigned a concept, the ancestors and descendants of that concept are typically not assigned to the same document. The documents were originally annotated with concepts from levels 3 to 8.
Chalkidis et al. (2021) augmented the annotation with three alternative sets of labels per document, replacing each assigned concept with its ancestor from level 1, 2, or 3, respectively.
Thus, Chalkidis et al. (2021) provide four sets of gold labels per document: one for each of the first three levels of the hierarchy, plus the original sparse label assignment. Levels 4 to 8 cannot be used independently, as many documents have gold concepts from the third level; thus many documents would be mislabeled if level 3 were discarded.
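The backtracking of assigned concepts to their level-1 to level-3 ancestors can be illustrated with a toy hierarchy (the concept IDs `c1`-`c5` are hypothetical; the real mapping comes from the EUROVOC thesaurus):

```python
# Toy EUROVOC-like hierarchy (hypothetical concept IDs): child -> parent.
parents = {"c5": "c3", "c3": "c2", "c2": "c1"}  # "c1" is a level-1 root

def ancestor_at_level(concept, parents, level):
    """Backtrack to the ancestor at the requested level (root = level 1)."""
    chain = [concept]
    while chain[-1] in parents:
        chain.append(parents[chain[-1]])
    chain.reverse()  # root-first: level 1 is chain[0]
    return chain[min(level, len(chain)) - 1]

print(ancestor_at_level("c5", parents, 1))  # c1
print(ancestor_at_level("c5", parents, 2))  # c2
print(ancestor_at_level("c5", parents, 3))  # c3
```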
#### Who are the annotators?
Publications Office of EU (https://publications.europa.eu/en)
### Personal and Sensitive Information
The dataset contains publicly available EU laws that do not include personal or sensitive information with the exception of trivial information presented by consent, e.g., the names of the current presidents of the European Parliament and European Council, and other administration bodies.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Xenouleas et al. (2022)
### Licensing Information
We provide MultiEURLEX with the same licensing as the original EU data (CC-BY-4.0):
© European Union, 1998-2021
The Commission’s document reuse policy is based on Decision 2011/833/EU. Unless otherwise specified, you can re-use the legal documents published in EUR-Lex for commercial or non-commercial purposes.
The copyright for the editorial content of this website, the summaries of EU legislation and the consolidated texts, which is owned by the EU, is licensed under the Creative Commons Attribution 4.0 International licence. This means that you can re-use the content provided you acknowledge the source and indicate any changes you have made.
Source: https://eur-lex.europa.eu/content/legal-notice/legal-notice.html \
Read more: https://eur-lex.europa.eu/content/help/faq/reuse-contents-eurlex.html
### Citation Information
*Stratos Xenouleas, Alexia Tsoukara, Giannis Panagiotakis, Ilias Chalkidis, and Ion Androutsopoulos.*
*Realistic Zero-Shot Cross-Lingual Transfer in Legal Topic Classification.*
*Proceedings of the 12th Hellenic Conference on Artificial Intelligence (SETN 2022). Corfu, Greece. 2022.*
```
@InProceedings{xenouleas-etal-2022-realistic-multieurlex,
author = {Xenouleas, Stratos
and Tsoukara, Alexia
and Panagiotakis, Giannis
and Chalkidis, Ilias
and Androutsopoulos, Ion},
title = {Realistic Zero-Shot Cross-Lingual Transfer in Legal Topic Classification},
booktitle = {Proceedings of 12th Hellenic Conference on Artificial Intelligence (SETN 2022)},
year = {2022},
publisher = {Association for Computing Machinery},
location = {Corfu, Greece},
}
```
### Contributions
Thanks to [@iliaschalkidis](https://github.com/iliaschalkidis) for adding this dataset. |
lecslab/ddo-igt | ---
dataset_info:
features:
- name: glottocode
dtype: string
- name: metalang_glottocode
dtype: string
- name: is_segmented
dtype: string
- name: source
dtype: string
- name: id
dtype: string
- name: transcription
dtype: string
- name: glosses
dtype: string
- name: translation
dtype: string
- name: pos_glosses
dtype: string
splits:
- name: train
num_bytes: 3628449
num_examples: 7116
- name: eval
num_bytes: 460002
num_examples: 890
- name: test
num_bytes: 449100
num_examples: 890
download_size: 1271363
dataset_size: 4537551
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
- split: test
path: data/test-*
---
|
huggingartists/the-weeknd | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/the-weeknd"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 1.849373 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/f0813e600d43b8b43c94e8ba1dde880a.640x640x1.png')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/the-weeknd">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Weeknd</div>
<a href="https://genius.com/artists/the-weeknd">
<div style="text-align: center; font-size: 14px;">@the-weeknd</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/the-weeknd).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/the-weeknd")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|1.849373| -| -|
'Train' can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/the-weeknd")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
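The split proportions in the snippet above can be sanity-checked on dummy data, without downloading anything:

```python
import numpy as np

texts = [f"line {i}" for i in range(100)]
train_pct, val_pct = 0.9, 0.07  # test gets the remaining 0.03
train, validation, test = np.split(
    np.array(texts),
    [int(len(texts) * train_pct), int(len(texts) * (train_pct + val_pct))],
)
print(len(train), len(validation), len(test))  # 90 7 3
```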
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk}
year=2021
}
```
|
DeepFoldProtein/CATH_v4.3_S35_processed_512_test | ---
dataset_info:
features:
- name: index
dtype: string
- name: ndom
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: domain_labels
sequence:
sequence: int64
- name: label
sequence:
sequence: float64
splits:
- name: train
num_bytes: 195469165
num_examples: 93
download_size: 38878
dataset_size: 195469165
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ntmkhanh/recipe | ---
license: apache-2.0
---
|
nielsr/datacomp-small-filtered | ---
dataset_info:
features:
- name: uid
dtype: string
- name: url
dtype: string
- name: text
dtype: string
- name: original_width
dtype: int64
- name: original_height
dtype: int64
- name: clip_b32_similarity_score
dtype: float32
- name: clip_l14_similarity_score
dtype: float32
- name: face_bboxes
sequence:
sequence: float64
- name: sha256
dtype: string
- name: clip_l14_embedding
sequence: float64
splits:
- name: train
num_bytes: 22741192622.844204
num_examples: 3521953
download_size: 6371420677
dataset_size: 22741192622.844204
---
# Dataset Card for "datacomp-small-filtered"
This is the [DataComp-small]() dataset with CLIP-large-patch14 image embeddings added, as well as:
- captions filtered for English using a FastText model
- captions filtered to have a complexity of at least 1 |
senhorsapo/simon | ---
license: openrail
---
|
Sunny5555/My_First_Dataset | ---
license: afl-3.0
---
|
ericbalfour1977/eric-balfour | ---
license: openrail
---
|
nataliaElv/oasst_response_comparison | ---
size_categories: 1K<n<10K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for oasst_response_comparison
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("nataliaElv/oasst_response_comparison")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("nataliaElv/oasst_response_comparison")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| prompt | Prompt | text | True | True |
| reply_1 | Reply 1 | text | True | True |
| reply_2 | Reply 2 | text | True | True |
| reply_3 | Reply 3 | text | True | True |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| reply-ranking | Order replies based on your preference | ranking | True | Order the replies from 1 to 3 based on your preference. | ['reply-1', 'reply-2', 'reply-3'] |
The **suggestions** are human or machine generated recommendations for each question to assist the annotator during the annotation process, so those are always linked to the existing questions, and named appending "-suggestion" and "-suggestion-metadata" to those, containing the value/s of the suggestion and its metadata, respectively. So on, the possible values are the same as in the table above, but the column name is appended with "-suggestion" and the metadata is appended with "-suggestion-metadata".
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
The **guidelines**, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"prompt": "Can you write a short introduction about the relevance of the term \"monopsony\" in economics? Please use examples related to potential monopsonies in the labour market and cite relevant research.",
"reply_1": "\"Monopsony\" refers to a market structure where there is only one buyer for a particular good or service. In economics, this term is particularly relevant in the labor market, where a monopsony employer has significant power over the wages and working conditions of their employees. The presence of a monopsony can result in lower wages and reduced employment opportunities for workers, as the employer has little incentive to increase wages or provide better working conditions.\n\nRecent research has identified potential monopsonies in industries such as retail and fast food, where a few large companies control a significant portion of the market (Bivens \u0026 Mishel, 2013). In these industries, workers often face low wages, limited benefits, and reduced bargaining power, leading to a situation where they are dependent on the employer for their livelihood. This dependence can result in further suppression of wages and a decline in working conditions.\n\nOverall, the concept of monopsony is essential to understanding the dynamics of labor markets and the impact of market power on workers. Further research is needed to understand the extent and impact of monopsonies on the economy and to develop policies to address this issue.\n\nReferences:\nBivens, J., \u0026 Mishel, L. (2013). The Pay of Corporate Executives and Financial Professionals as Evidence of Rents in Top 1 Percent Incomes. Journal of Economic Perspectives, 27(3), 57-78.",
"reply_2": "Monopsony is a market structure in which there is a single buyer in a market. In the context of labor markets, a monopsony occurs when there is only one employer in a particular industry or geographic area, giving that employer significant power over the wages and working conditions of workers. This results in a situation where the employer has the ability to suppress wages and reduce the bargaining power of workers.\n\nResearch has shown that the existence of monopsony power in labor markets can have negative effects on workers, such as lower wages, reduced benefits, and limited job opportunities. For example, a study by Autor, Dorn, and Hanson (2013) found evidence of monopsony power in certain U.S. labor markets, resulting in lower wages and reduced job opportunities for workers. Similarly, another study by Bharadwaj, Black, and Kolesnikova (2019) found that the presence of a single dominant employer in a local labor market was associated with lower wages for workers.",
"reply_3": "Monopsony refers to a market structure where there is only one buyer of a good or service. In the context of labor markets, a monopsony occurs when there is only one employer in a particular geographic area or industry, giving them significant power to set wages and working conditions. Research has shown that the existence of monopsonies can lead to lower wages and reduced job opportunities for workers. For example, a study by the National Bureau of Economic Research found that in industries with high levels of concentration, workers earn lower wages and are less likely to receive benefits such as health insurance."
},
"metadata": {},
"responses": [],
"suggestions": [],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"metadata": "{}",
"prompt": "Can you write a short introduction about the relevance of the term \"monopsony\" in economics? Please use examples related to potential monopsonies in the labour market and cite relevant research.",
"reply-ranking": [],
"reply-ranking-suggestion": null,
"reply-ranking-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"reply_1": "\"Monopsony\" refers to a market structure where there is only one buyer for a particular good or service. In economics, this term is particularly relevant in the labor market, where a monopsony employer has significant power over the wages and working conditions of their employees. The presence of a monopsony can result in lower wages and reduced employment opportunities for workers, as the employer has little incentive to increase wages or provide better working conditions.\n\nRecent research has identified potential monopsonies in industries such as retail and fast food, where a few large companies control a significant portion of the market (Bivens \u0026 Mishel, 2013). In these industries, workers often face low wages, limited benefits, and reduced bargaining power, leading to a situation where they are dependent on the employer for their livelihood. This dependence can result in further suppression of wages and a decline in working conditions.\n\nOverall, the concept of monopsony is essential to understanding the dynamics of labor markets and the impact of market power on workers. Further research is needed to understand the extent and impact of monopsonies on the economy and to develop policies to address this issue.\n\nReferences:\nBivens, J., \u0026 Mishel, L. (2013). The Pay of Corporate Executives and Financial Professionals as Evidence of Rents in Top 1 Percent Incomes. Journal of Economic Perspectives, 27(3), 57-78.",
"reply_2": "Monopsony is a market structure in which there is a single buyer in a market. In the context of labor markets, a monopsony occurs when there is only one employer in a particular industry or geographic area, giving that employer significant power over the wages and working conditions of workers. This results in a situation where the employer has the ability to suppress wages and reduce the bargaining power of workers.\n\nResearch has shown that the existence of monopsony power in labor markets can have negative effects on workers, such as lower wages, reduced benefits, and limited job opportunities. For example, a study by Autor, Dorn, and Hanson (2013) found evidence of monopsony power in certain U.S. labor markets, resulting in lower wages and reduced job opportunities for workers. Similarly, another study by Bharadwaj, Black, and Kolesnikova (2019) found that the presence of a single dominant employer in a local labor market was associated with lower wages for workers.",
"reply_3": "Monopsony refers to a market structure where there is only one buyer of a good or service. In the context of labor markets, a monopsony occurs when there is only one employer in a particular geographic area or industry, giving them significant power to set wages and working conditions. Research has shown that the existence of monopsonies can lead to lower wages and reduced job opportunities for workers. For example, a study by the National Bureau of Economic Research found that in industries with high levels of concentration, workers earn lower wages and are less likely to receive benefits such as health insurance."
}
```
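One quirk of the flattened format is visible above: `metadata` is stored as a JSON-encoded *string* (`"{}"`), so it needs its own decode step. A stdlib-only sketch on a trimmed copy of the record:

```python
import json

# Trimmed copy of the flattened record shown above: note that
# `metadata` arrives as a JSON-encoded string, not a dict.
record = json.loads('''
{
  "external_id": null,
  "metadata": "{}",
  "reply-ranking": [],
  "reply-ranking-suggestion": null
}
''')
metadata = json.loads(record["metadata"])  # second decode step
print(type(metadata).__name__, metadata)  # dict {}
```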
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **prompt** is of type `text`.
* **reply_1** is of type `text`.
* **reply_2** is of type `text`.
* **reply_3** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **reply-ranking** is of type `ranking` with the following allowed values ['reply-1', 'reply-2', 'reply-3'], and description "Order the replies from 1 to 3 based on your preference.".
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **reply-ranking-suggestion** is of type `ranking` with the following allowed values ['reply-1', 'reply-2', 'reply-3'].
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
For each prompt, order the replies in a ranking based on how clear and helpful you find each reply. Ties are allowed. If you prefer not to give an answer, click Discard and move on to the next record.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
zan/lima-ja | ---
license: cc-by-nc-sa-4.0
language:
- ja
---
# Dataset Card for LIMA-JA
## Dataset Description
This is the Japanese LIMA dataset, translated from the [LIMA dataset](https://huggingface.co/datasets/GAIR/lima) on which Meta's [LIMA model](https://arxiv.org/abs/2305.11206) (Zhou et al., 2023) was trained.
The Japanese translation was produced by ChatGPT and then manually adjusted.
## Usage
```python
>>> from datasets import load_dataset
>>> dataset = load_dataset('zan/lima-ja')
>>> # dataset = load_dataset('zan/lima-ja', "v1")
```
```python
>>> dataset["train"][0]
{'conversations': ['脳細胞は移動することができるのでしょうか? 移動とは、遠距離の移動(可能ならば脳内のみで)を指します。',
'この質問は比較的広範であり、脳は単にニューロンだけでなく、グリア細胞(支持細胞)や前ミトーシスの神経幹細胞も含まれることを考慮する必要があります。さらに、重要な同僚科学者たちが指摘しているように、発生段階も非常に重要であり、発達中の胎児の脳は成人の脳とは非常に異なります。\n\nしかし、さまざまな出版物を精査した結果、この質問の答えは実際には非常に単純であることがわかりました。結論として、脳細胞は移動します。\n\n成人の脳では、グリア細胞が脳内を移動します(Klämbt、2009年)。グリア細胞は多くの機能に関与していますが、移動するグリア細胞の注目すべき例は、標的の軸索を見つけるために比較的長距離を移動し、それに巻きついて絶縁性のミエリン鞘を形成するオリゴデンドロサイトです(TsaiとMiller、2002年)。\n\n神経幹細胞は損傷に対応して長距離を移動します(Imitola et al.、2004年)し、特定の幹細胞の位置(例えば、海馬や脳室下帯)から他の領域に移動します(Clarke、2003年)。\n\nミトーシス後でありながら未分化のニューロンは、魚(Scott et al.、2012年)や哺乳類、および非人猿でも成人の脳内を移動することが示されています(Sawada et al.、2011年)。\n\n驚くことではありませんが、グリア細胞、幹細胞、およびニューロンは胎児の発生中も移動します。特に、末梢機能を果たすために運命づけられた分裂後のニューロンは、神経堤から標的の位置まで比較的長い距離を移動しなければなりません(Neuroscience、第2版、Neuronal Migration)。'],
'source': 'stackexchange'}
```
## Version Description
### v1
A version with about 100 manual changes applied after translation by ChatGPT.
### v2
A further revised version.
(Coming soon...)
## License
If the source data of LIMA has a stricter license than CC BY-NC-SA, the LIMA-JA dataset follows it; otherwise, it follows the CC BY-NC-SA license.
## Citation Information
```
@InProceedings{huggingface:dataset,
title = {LIMA-JA: Japanese LIMA Dataset for Efficient Instruction-tuning},
author = {zan},
year = {2023}
}
``` |
mdass/gpt_gen_desc_logos_v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: description
dtype: string
splits:
- name: train
num_bytes: 2028708.0
num_examples: 100
download_size: 2003239
dataset_size: 2028708.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-21000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1095675
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/mr-tydi_fi_dev | ---
pretty_name: '`mr-tydi/fi/dev`'
viewer: false
source_datasets: ['irds/mr-tydi_fi']
task_categories:
- text-retrieval
---
# Dataset Card for `mr-tydi/fi/dev`
The `mr-tydi/fi/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mr-tydi#mr-tydi/fi/dev).
# Data
This dataset provides:
- `queries` (i.e., topics); count=1,738
- `qrels` (relevance assessments); count=1,738
- For `docs`, use [`irds/mr-tydi_fi`](https://huggingface.co/datasets/irds/mr-tydi_fi)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mr-tydi_fi_dev', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mr-tydi_fi_dev', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Zhang2021MrTyDi,
title={{Mr. TyDi}: A Multi-lingual Benchmark for Dense Retrieval},
author={Xinyu Zhang and Xueguang Ma and Peng Shi and Jimmy Lin},
year={2021},
journal={arXiv:2108.08787},
}
@article{Clark2020TyDiQa,
title={{TyDi QA}: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages},
author={Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki},
year={2020},
journal={Transactions of the Association for Computational Linguistics}
}
```
|
CyberHarem/mp41_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mp41/MP41/MP41 (Girls' Frontline)
This is the dataset of mp41/MP41/MP41 (Girls' Frontline), containing 47 images and their tags.
The core tags of this character are `blue_eyes, short_hair, hat, hair_ornament, blonde_hair, military_hat, bangs, hairclip`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 47 | 75.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mp41_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 47 | 35.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mp41_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 121 | 79.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mp41_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 47 | 62.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mp41_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 121 | 120.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mp41_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mp41_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, military_uniform, solo, smile, necktie, boots, wrist_cuffs, submachine_gun, armband, full_body, looking_at_viewer, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | military_uniform | solo | smile | necktie | boots | wrist_cuffs | submachine_gun | armband | full_body | looking_at_viewer | sitting |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:-------|:--------|:----------|:--------|:--------------|:-----------------|:----------|:------------|:--------------------|:----------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
|
sukantan/nyaya-ae-all-MiniLM-L6-v2-ftlegal-v2 | ---
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
splits:
- name: train
num_bytes: 82618368
num_examples: 53788
download_size: 99745430
dataset_size: 82618368
---
# Dataset Card for "nyaya-ae-all-MiniLM-L6-v2-ftlegal-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
charliecb/levenshtein_filtering | ---
license: openrail
---
|
AdapterOcean/physics_dataset_standardized_cluster_2_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 17146201
num_examples: 11144
download_size: 0
dataset_size: 17146201
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "physics_dataset_standardized_cluster_2_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Srikar6476/indian_food | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': burger
'1': butter_naan
'2': chai
'3': chapati
'4': chole_bhature
'5': dal_makhani
'6': dhokla
'7': fried_rice
'8': idli
'9': jalebi
'10': kaathi_rolls
'11': kadai_paneer
'12': kulfi
'13': masala_dosa
'14': momos
'15': paani_puri
'16': pakode
'17': pav_bhaji
'18': pizza
'19': samosa
splits:
- name: train
num_bytes: 1586188582.7514334
num_examples: 5328
- name: test
num_bytes: 223888796.3925666
num_examples: 941
download_size: 1601964166
dataset_size: 1810077379.144
---
# Dataset Card for "indian_food"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anan-2024/twitter_dataset_1713098736 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 71892
num_examples: 180
download_size: 44027
dataset_size: 71892
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kennyloggins346/kennyloggins | ---
license: unknown
---
|
ibranze/araproje_arc_en_w1 | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: validation
num_bytes: 79944.54545454546
num_examples: 250
download_size: 0
dataset_size: 79944.54545454546
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_arc_en_w1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/downes_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of downes/ダウンズ/唐斯 (Azur Lane)
This is the dataset of downes/ダウンズ/唐斯 (Azur Lane), containing 29 images and their tags.
The core tags of this character are `long_hair, red_eyes, white_hair, bandaid_on_face, hair_ornament, mole, ahoge, hairclip, heterochromia, mole_under_eye, yellow_eyes, earrings, low_ponytail, breasts, multicolored_hair, bangs, red_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 29 | 27.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/downes_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 29 | 18.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/downes_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 64 | 37.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/downes_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 29 | 25.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/downes_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 64 | 48.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/downes_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/downes_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 29 |  |  |  |  |  | 1girl, bandaid_on_nose, solo, smile, jewelry, looking_at_viewer, fingerless_gloves, open_mouth, black_gloves, collarbone, thighhighs, white_shirt, bare_shoulders, blush |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bandaid_on_nose | solo | smile | jewelry | looking_at_viewer | fingerless_gloves | open_mouth | black_gloves | collarbone | thighhighs | white_shirt | bare_shoulders | blush |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------------|:-------|:--------|:----------|:--------------------|:--------------------|:-------------|:---------------|:-------------|:-------------|:--------------|:-----------------|:--------|
| 0 | 29 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
presencesw/vinli_entailment_test | ---
dataset_info:
features:
- name: gold_label
dtype: string
- name: sentence1
dtype: string
- name: sentence2
dtype: string
splits:
- name: train
num_bytes: 3158
num_examples: 10
download_size: 4449
dataset_size: 3158
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dmrau/cqadupstack-wordpress | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 35736
num_examples: 541
- name: corpus
num_bytes: 53026140
num_examples: 48605
download_size: 26551471
dataset_size: 53061876
---
# Dataset Card for "cqadupstack-wordpress"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
piazzola/semEval_rebel | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: sentence
dtype: string
- name: relation
dtype:
class_label:
names:
'0': Cause-Effect(e1,e2)
'1': Cause-Effect(e2,e1)
'2': Component-Whole(e1,e2)
'3': Component-Whole(e2,e1)
'4': Content-Container(e1,e2)
'5': Content-Container(e2,e1)
'6': Entity-Destination(e1,e2)
'7': Entity-Destination(e2,e1)
'8': Entity-Origin(e1,e2)
'9': Entity-Origin(e2,e1)
'10': Instrument-Agency(e1,e2)
'11': Instrument-Agency(e2,e1)
'12': Member-Collection(e1,e2)
'13': Member-Collection(e2,e1)
'14': Message-Topic(e1,e2)
'15': Message-Topic(e2,e1)
'16': Product-Producer(e1,e2)
'17': Product-Producer(e2,e1)
'18': Other
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: triplets
dtype: string
splits:
- name: train
num_bytes: 2976954
num_examples: 8000
- name: test
num_bytes: 1007749
num_examples: 2717
download_size: 2580448
dataset_size: 3984703
---
# Dataset Card for "semEval_rebel"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Juniorgja/nandopinheiro | ---
license: openrail
---
|
316usman/medical | ---
license: bsd
dataset_info:
features:
- name: vector
sequence: float32
- name: metadata
struct:
- name: text
dtype: string
splits:
- name: train
num_bytes: 955472430
num_examples: 158114
download_size: 898315744
dataset_size: 955472430
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
trpakov/chest-xray-classification | ---
task_categories:
- image-classification
tags:
- roboflow
- roboflow2huggingface
- Biology
---
<div align="center">
<img width="640" alt="trpakov/chest-xray-classification" src="https://huggingface.co/datasets/trpakov/chest-xray-classification/resolve/main/thumbnail.jpg">
</div>
### Dataset Labels
```
['PNEUMONIA', 'NORMAL']
```
### Number of Images
```json
{'test': 582, 'valid': 1165, 'train': 12230}
```
### How to Use
- Install [datasets](https://pypi.org/project/datasets/):
```bash
pip install datasets
```
- Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("trpakov/chest-xray-classification", name="full")
example = ds['train'][0]
```
### Roboflow Dataset Page
[https://universe.roboflow.com/mohamed-traore-2ekkp/chest-x-rays-qjmia/dataset/3](https://universe.roboflow.com/mohamed-traore-2ekkp/chest-x-rays-qjmia/dataset/3?ref=roboflow2huggingface)
### Citation
```
```
### License
CC BY 4.0
### Dataset Summary
This dataset was exported via roboflow.ai on December 8, 2021 at 12:45 AM GMT
It includes 13977 images.
Pneumonia cases are annotated in folder format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
* Resize to 640x640 (Stretch)
The following augmentation was applied to create 3 versions of each source image:
* Random shear of between -3° and +3° horizontally and -2° and +2° vertically
* Random brightness adjustment of between -5 and +5 percent
* Random exposure adjustment of between -5 and +5 percent
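The exact Roboflow augmentation code is not public, but the random brightness step listed above can be sketched in plain Python (the function name and the flat pixel-list representation are assumptions for illustration):

```python
import random

def random_brightness(pixels, low=-0.05, high=0.05, rng=random):
    """Scale every pixel by one random factor drawn from [1 + low, 1 + high],
    clamping results to the valid [0, 255] range."""
    factor = 1.0 + rng.uniform(low, high)
    return [min(255, max(0, round(p * factor))) for p in pixels]

# Each call draws a fresh factor, so three calls give three augmented copies,
# mirroring the "3 versions of each source image" mentioned above:
augmented = [random_brightness([0, 128, 255]) for _ in range(3)]
```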
|
tsetsuuhei/filtered_train_dataset | ---
dataset_info:
features:
- name: translation
struct:
- name: en
dtype: string
- name: es
dtype: string
splits:
- name: train
num_bytes: 132669938
num_examples: 988660
download_size: 96784983
dataset_size: 132669938
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
edumunozsala/preference-hate-speech-es | ---
language:
- en
- es
dataset_info:
features:
- name: input
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
sequence: string
- name: raw_generation_responses
sequence: string
- name: generations
sequence: string
splits:
- name: train
num_bytes: 5520144
num_examples: 2937
download_size: 1711074
dataset_size: 5520144
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
projecte-aina/tecla | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- ca
license:
- cc-by-nc-nd-4.0
multilinguality:
- monolingual
pretty_name: tecla
size_categories:
- unknown
source_datasets: []
task_categories:
- text-classification
task_ids:
- multi-class-classification
---
# Dataset Card for TeCla
## Dataset Description
- **Website:** [Zenodo](https://zenodo.org/record/7334110)
- **Point of Contact:** langtech@bsc.es
### Dataset Summary
TeCla (Text Classification) is a Catalan News corpus for thematic multi-class text classification tasks. The present version (2.0) contains 113,376 articles classified under a hierarchical class structure consisting of a coarse-grained and a fine-grained class. Each of the 4 coarse-grained classes accepts a subset of the fine-grained ones, 53 in total.
The previous version (1.0.1) can still be found at https://zenodo.org/record/4761505
This dataset was developed by [BSC TeMU](https://temu.bsc.es/) as part of [Projecte AINA](https://politiquesdigitals.gencat.cat/ca/economia/catalonia-ai/aina/), to enrich the [Catalan Language Understanding Benchmark (CLUB)](https://club.aina.bsc.es/).
This work is licensed under a <a rel="license" href="https://creativecommons.org/licenses/by-nc-nd/4.0/">Attribution-NonCommercial-NoDerivatives 4.0 International License</a>.
### Supported Tasks and Leaderboards
Text classification, Language Model
### Languages
The dataset is in Catalan (`ca-ES`).
## Dataset Structure
### Data Instances
Three JSON files, one for each split.
### Data Fields
Each example contains the following 3 fields:
* text: the article text (string)
* label1: the coarse-grained class
* label2: the fine-grained class
#### Example:
<pre>
{"version": "2.0",
"data":
[
{
'sentence': "La setena edició del Festival Fantàstik inclourà les cintes 'Matar a dios' i 'Mandy' i un homenatge a 'Mi vecino Totoro'. Es projectaran 22 curtmetratges seleccionats d'entre més de 500 presentats a nivell internacional. El Centre Cultural de Granollers acull del 8 a l'11 de novembre la setena edició del Festival Fantàstik. El certamen, que s'allargarà un dia, arrencarà amb la projecció de la cinta de Caye Casas i Albert Pide 'Matar a Dios'. Els dos directors estaran presents en la inauguració de la cita. A més, els asssitents podran gaudir de 'Mandy', el darrer treball de Nicolas Cage. Altres llargmetratges seleccionats per aquest any són 'Aterrados' (2017), 'Revenge' (2017), 'A Mata Negra' (2018), 'Top Knot Detective' (2018) i 'La Gran Desfeta' (2018). A més, amb motiu del trentè aniversari de la pel·lícula 'El meu veí Totoro' es durà a terme l'exposició dedicada a aquest film '30 anys 30 artistes' comissariada per Jordi Pastor i Reinaldo Pereira. La mostra '30 anys 30 artistes' recull els treballs de trenta artistes d'estils diferents al voltant de la figura de Totoro i el seu director. Es podrà veure durant els dies de festival i es complementarà amb la projecció de la pel·lícula el diumenge 11 de novembre. Al llarg del festival també es projectaran els 22 curtmetratges prèviament seleccionats d'entre més de 500 presentats a nivell internacional. El millor tindrà una dotació de 1000 euros fruit de la unió de forces amb el Mercat Audiovisual de Catalunya.",
'label1': 'Cultura',
'label2': 'Cinema'
},
...
]
}
</pre>
#### Labels
* label1: 'Societat', 'Política', 'Economia', 'Cultura'
* label2: 'Llengua', 'Infraestructures', 'Arts', 'Parlament', 'Noves tecnologies', 'Castells', 'Successos', 'Empresa', 'Mobilitat', 'Teatre', 'Treball', 'Logística', 'Urbanisme', 'Govern', 'Entitats', 'Finances', 'Govern espanyol', 'Trànsit', 'Indústria', 'Esports', 'Exteriors', 'Medi ambient', 'Habitatge', 'Salut', 'Equipaments i patrimoni', 'Recerca', 'Cooperació', 'Innovació', 'Agroalimentació', 'Policial', 'Serveis Socials', 'Cinema', 'Memòria històrica', 'Turisme', 'Política municipal', 'Comerç', 'Universitats', 'Hisenda', 'Judicial', 'Partits', 'Música', 'Lletres', 'Religió', 'Festa i cultura popular', 'Unió Europea', 'Moda', 'Moviments socials', 'Comptes públics', 'Immigració', 'Educació', 'Gastronomia', 'Meteorologia', 'Energia'
### Data Splits
Train, development and test splits were created in a stratified fashion, following a 0.8, 0.05 and 0.15 proportion, respectively. The sizes of each split are the following:
* train.json: 90700 examples
* dev.json: 5669 examples
* test.json: 17007 examples
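Given the documented file layout, a split file can be consumed as plain JSON. The snippet below is a minimal sketch over a toy payload that mirrors the documented structure (field names follow the example above, which uses `sentence` for the article text):

```python
import json
from collections import Counter

# Toy payload mirroring the documented split-file structure.
raw = json.dumps({
    "version": "2.0",
    "data": [
        {"sentence": "El festival arrenca avui.", "label1": "Cultura", "label2": "Cinema"},
        {"sentence": "El Parlament vota la llei.", "label1": "Política", "label2": "Parlament"},
    ],
})

split = json.loads(raw)
examples = split["data"]
# Tally the coarse-grained classes across the split.
coarse_counts = Counter(ex["label1"] for ex in examples)
```

To use the real data, replace `raw` with the contents of `train.json`, `dev.json` or `test.json`.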
## Dataset Creation
### Curation Rationale
We created this dataset to contribute to the development of language models in Catalan, a low-resource language.
### Source Data
#### Initial Data Collection and Normalization
The source data are crawled articles from the Catalan News Agency ([Agència Catalana de Notícies, ACN](https://www.acn.cat/)) site.
We crawled 219,586 articles from the ACN newswire archive, the latest dated October 11, 2020.
From the crawled data, we selected those articles whose 'section' and 'subsection' categories followed the expected codification combinations included in ACN's style guide and whose 'section' met the requirements of containing subsections and being thematically defined (in contrast to geographically defined categories such as 'Món' and 'Unió Europea'). The articles originally belonging to the 'Unió Europea' section, which were related to political organisms of the European Union, were included in the 'Política' coarse-grained category (within a fine-grained category named 'Unió Europea') because some of the original subsections of 'Política' are closely related to those of 'Unió Europea', both being defined by the specific political organism dealt with in the article.
The text field in each example is a concatenation of the original title, subtitle and body of the article (before the concatenation, a final period was added to the title and subtitle whenever they lacked one). The preprocessing of the texts was minimal and consisted of removing the pattern "ACN {location}.-" that preceded the body of each text, as well as the newlines originally used to divide the text into paragraphs.
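That cleanup step can be sketched as follows (a minimal illustration of the described preprocessing, not the curators' actual script; the shape of the location pattern before ".-" is an assumption):

```python
import re

# Assumed shape of the boilerplate prefix: "ACN <location>.-" before the body.
ACN_PREFIX = re.compile(r"^ACN\s+[^.]+\.-\s*")

def preprocess(body: str) -> str:
    """Drop the agency prefix and collapse paragraph newlines into spaces."""
    text = ACN_PREFIX.sub("", body)
    return " ".join(text.split())

sample = "ACN Barcelona.-El festival arrenca avui.\nLa mostra continua."
print(preprocess(sample))
```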
#### Who are the source language producers?
The Catalan News Agency ([Agència Catalana de Notícies; ACN](https://www.acn.cat/)) is a news agency owned by the Catalan government via the public corporation Intracatalònia, SA. It is one of the first digital news agencies created in Europe and has been operating since 1999 (source: [wikipedia](https://en.wikipedia.org/wiki/Catalan_News_Agency)).
### Annotations
#### Annotation process
The crawled data contained the categories' annotations, which were then used to create this dataset with the mentioned criteria.
#### Who are the annotators?
Editorial staff classified the articles under the different thematic sections and subsections, and we extracted these labels from the article metadata.
### Personal and Sensitive Information
No personal or sensitive information is included.
## Considerations for Using the Data
### Social Impact of Dataset
We hope this dataset contributes to the development of language models in Catalan, a low-resource language.
### Discussion of Biases
[N/A]
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
Irene Baucells (irene.baucells@bsc.es), Casimiro Pio Carrino (casimiro.carrino@bsc.es), Carlos Rodríguez (carlos.rodriguez1@bsc.es) and Carme Armentano (carme.armentano@bsc.es), from [BSC-CNS](https://www.bsc.es/).
This work was funded by the [Departament de la Vicepresidència i de Polítiques Digitals i Territori de la Generalitat de Catalunya](https://politiquesdigitals.gencat.cat/ca/inici/index.html#googtrans(ca|en)) within the framework of [Projecte AINA](https://politiquesdigitals.gencat.cat/ca/economia/catalonia-ai/aina).
### Licensing Information
This work is licensed under a <a rel="license" href="https://creativecommons.org/licenses/by-nc-nd/4.0/">Attribution-NonCommercial-NoDerivatives 4.0 International License</a>.
### Citation Information
|
open-llm-leaderboard/details_CultriX__NeuralTrix-bf16 | ---
pretty_name: Evaluation run of CultriX/NeuralTrix-bf16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CultriX/NeuralTrix-bf16](https://huggingface.co/CultriX/NeuralTrix-bf16) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CultriX__NeuralTrix-bf16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-03T03:22:10.041782](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__NeuralTrix-bf16/blob/main/results_2024-03-03T03-22-10.041782.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6486765449678997,\n\
\ \"acc_stderr\": 0.03222136547241223,\n \"acc_norm\": 0.6483708505012482,\n\
\ \"acc_norm_stderr\": 0.032892331194695104,\n \"mc1\": 0.6450428396572827,\n\
\ \"mc1_stderr\": 0.016750862381375898,\n \"mc2\": 0.7953780824002232,\n\
\ \"mc2_stderr\": 0.013387241984191129\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.013385021637313572,\n\
\ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545796\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7168890659231228,\n\
\ \"acc_stderr\": 0.004495891440519421,\n \"acc_norm\": 0.8911571400119498,\n\
\ \"acc_norm_stderr\": 0.00310805456335211\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.044631127206771704,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.044631127206771704\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033484,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033484\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092444,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092444\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.026156867523931048,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.026156867523931048\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n\
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579827,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n\
\ \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n\
\ \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799215,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799215\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6450428396572827,\n\
\ \"mc1_stderr\": 0.016750862381375898,\n \"mc2\": 0.7953780824002232,\n\
\ \"mc2_stderr\": 0.013387241984191129\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.846093133385951,\n \"acc_stderr\": 0.010141944523750026\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6717210007581501,\n \
\ \"acc_stderr\": 0.012934758019449618\n }\n}\n```"
repo_url: https://huggingface.co/CultriX/NeuralTrix-bf16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|arc:challenge|25_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|gsm8k|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hellaswag|10_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T03-22-10.041782.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T03-22-10.041782.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- '**/details_harness|winogrande|5_2024-03-03T03-22-10.041782.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-03T03-22-10.041782.parquet'
- config_name: results
data_files:
- split: 2024_03_03T03_22_10.041782
path:
- results_2024-03-03T03-22-10.041782.parquet
- split: latest
path:
- results_2024-03-03T03-22-10.041782.parquet
---
# Dataset Card for Evaluation run of CultriX/NeuralTrix-bf16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CultriX/NeuralTrix-bf16](https://huggingface.co/CultriX/NeuralTrix-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CultriX__NeuralTrix-bf16",
"harness_winogrande_5",
	split="latest")
```
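
Each per-task config name listed in the metadata above is a mechanical rewrite of the harness task name that appears in the results JSON (`|`, `:`, and `-` become `_`). A small helper sketch illustrating the mapping — the `task_to_config` name is illustrative, not part of the `datasets` API:

```python
def task_to_config(task: str) -> str:
    """Map a harness task name (as it appears in the results JSON below)
    to the matching config name of this dataset, e.g.
    "harness|hendrycksTest-anatomy|5" -> "harness_hendrycksTest_anatomy_5"."""
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

# Loading that task's latest details (requires network access to the Hub):
# from datasets import load_dataset
# data = load_dataset("open-llm-leaderboard/details_CultriX__NeuralTrix-bf16",
#                     task_to_config("harness|hendrycksTest-anatomy|5"),
#                     split="latest")
```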
## Latest results
These are the [latest results from run 2024-03-03T03:22:10.041782](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__NeuralTrix-bf16/blob/main/results_2024-03-03T03-22-10.041782.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6486765449678997,
"acc_stderr": 0.03222136547241223,
"acc_norm": 0.6483708505012482,
"acc_norm_stderr": 0.032892331194695104,
"mc1": 0.6450428396572827,
"mc1_stderr": 0.016750862381375898,
"mc2": 0.7953780824002232,
"mc2_stderr": 0.013387241984191129
},
"harness|arc:challenge|25": {
"acc": 0.7005119453924915,
"acc_stderr": 0.013385021637313572,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545796
},
"harness|hellaswag|10": {
"acc": 0.7168890659231228,
"acc_stderr": 0.004495891440519421,
"acc_norm": 0.8911571400119498,
"acc_norm_stderr": 0.00310805456335211
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.044631127206771704,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.044631127206771704
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033484,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092444,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092444
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931048,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931048
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579827,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799215,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799215
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910508,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6450428396572827,
"mc1_stderr": 0.016750862381375898,
"mc2": 0.7953780824002232,
"mc2_stderr": 0.013387241984191129
},
"harness|winogrande|5": {
"acc": 0.846093133385951,
"acc_stderr": 0.010141944523750026
},
"harness|gsm8k|5": {
"acc": 0.6717210007581501,
"acc_stderr": 0.012934758019449618
}
}
```
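The per-task accuracies in the JSON above can be combined into a single headline number. A minimal, unofficial sketch (not part of the eval harness) that macro-averages a handful of the subject accuracies shown above:

```python
# Macro-average a few of the per-task accuracies from the results JSON above.
# The subject names and values are copied verbatim from the block; this is an
# illustrative aggregation, not the leaderboard's official scoring code.
task_acc = {
    "marketing": 0.8803418803418803,
    "medical_genetics": 0.7,
    "miscellaneous": 0.8199233716475096,
    "virology": 0.572289156626506,
    "world_religions": 0.8187134502923976,
}

# Unweighted mean over tasks (each subject counts equally, regardless of size).
macro_avg = sum(task_acc.values()) / len(task_acc)
print(round(macro_avg, 4))
```

The leaderboard's aggregated MMLU score is computed over all 57 subjects; the five used here are just the ones visible in this excerpt.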
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
dvilasuero/comparison-data-falcon | ---
dataset_info:
features:
- name: instruction
dtype: string
id: field
- name: response-1
dtype: string
id: field
- name: response-2
dtype: string
id: field
- name: choose-best
sequence:
- name: user_id
dtype: string
- name: value
dtype: int32
- name: status
dtype: string
id: question
- name: external_id
dtype: string
id: external_id
splits:
- name: train
num_bytes: 124852
num_examples: 100
download_size: 87155
dataset_size: 124852
---
# Dataset Card for "comparison-data-falcon"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/AA_ApplicationDistilRoBERTa_110K_5_F | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 80318780.21618997
num_examples: 26057
- name: test
num_bytes: 26774087.073587257
num_examples: 8686
download_size: 147219352
dataset_size: 107092867.28977722
---
# Dataset Card for "AA_ApplicationDistilRoBERTa_110K_5_F"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Partha117/apache_bugs_with_chunks | ---
dataset_info:
features:
- name: issue_id
dtype: int64
- name: title
dtype: string
- name: body
dtype: string
- name: status
dtype: string
- name: after_fix_sha
dtype: string
- name: project_name
dtype: string
- name: repo_url
dtype: string
- name: repo_name
dtype: string
- name: language
dtype: string
- name: issue_url
dtype: 'null'
- name: before_fix_sha
dtype: 'null'
- name: pull_url
dtype: 'null'
- name: commit_datetime
dtype: timestamp[us, tz=UTC]
- name: report_datetime
dtype: timestamp[us, tz=UTC]
- name: updated_file
dtype: string
- name: chunk_content
dtype: string
splits:
- name: train
num_bytes: 1295665740
num_examples: 638182
download_size: 184713524
dataset_size: 1295665740
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_203 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1218716568.0
num_examples: 237474
download_size: 1248956512
dataset_size: 1218716568.0
---
# Dataset Card for "chunk_203"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ColtonDevAcc/ProudctInformation | ---
license: openrail
---
|
helvioviana/Clone | ---
license: openrail
---
|
krvhrv/crawl-books | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 8275719285
num_examples: 659893
download_size: 4465548628
dataset_size: 8275719285
---
# Dataset Card for "crawl-books"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/dollyaug-standardized_cluster_3 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 13252818
num_examples: 1327
download_size: 4149402
dataset_size: 13252818
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dollyaug-standardized_cluster_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-muse256-muse512-wuerst-sdv15/fd9df6ed | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 215
num_examples: 10
download_size: 1393
dataset_size: 215
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "fd9df6ed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anan-2024/twitter_dataset_1713000521 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 464221
num_examples: 1253
download_size: 246409
dataset_size: 464221
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Weyaxi__Cosmosis-3x34B | ---
pretty_name: Evaluation run of Weyaxi/Cosmosis-3x34B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/Cosmosis-3x34B](https://huggingface.co/Weyaxi/Cosmosis-3x34B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
  \ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Cosmosis-3x34B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2024-01-14T11:59:17.025888](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Cosmosis-3x34B/blob/main/results_2024-01-14T11-59-17.025888.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7691798340940261,\n\
\ \"acc_stderr\": 0.027910883477876437,\n \"acc_norm\": 0.7725855380923361,\n\
\ \"acc_norm_stderr\": 0.02844764712553433,\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6382238408380394,\n\
\ \"mc2_stderr\": 0.01475552588950266\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6655290102389079,\n \"acc_stderr\": 0.013787460322441377,\n\
\ \"acc_norm\": 0.697098976109215,\n \"acc_norm_stderr\": 0.013428241573185347\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6569408484365664,\n\
\ \"acc_stderr\": 0.004737608340163403,\n \"acc_norm\": 0.851822346146186,\n\
\ \"acc_norm_stderr\": 0.003545499169558051\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n\
\ \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n\
\ \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.9078947368421053,\n \"acc_stderr\": 0.02353268597044349,\n\
\ \"acc_norm\": 0.9078947368421053,\n \"acc_norm_stderr\": 0.02353268597044349\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8150943396226416,\n \"acc_stderr\": 0.02389335183446432,\n\
\ \"acc_norm\": 0.8150943396226416,\n \"acc_norm_stderr\": 0.02389335183446432\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9027777777777778,\n\
\ \"acc_stderr\": 0.02477451625044016,\n \"acc_norm\": 0.9027777777777778,\n\
\ \"acc_norm_stderr\": 0.02477451625044016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7572254335260116,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.7572254335260116,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.026355158413349417,\n\
\ \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.026355158413349417\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.6140350877192983,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.034165204477475494,\n\
\ \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.034165204477475494\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.701058201058201,\n \"acc_stderr\": 0.023577604791655802,\n \"\
acc_norm\": 0.701058201058201,\n \"acc_norm_stderr\": 0.023577604791655802\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5873015873015873,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.5873015873015873,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.9064516129032258,\n \"acc_stderr\": 0.016565754668270972,\n\
\ \"acc_norm\": 0.9064516129032258,\n \"acc_norm_stderr\": 0.016565754668270972\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6354679802955665,\n \"acc_stderr\": 0.0338640574606209,\n \"acc_norm\"\
: 0.6354679802955665,\n \"acc_norm_stderr\": 0.0338640574606209\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.02602465765165619,\n\
\ \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.02602465765165619\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9343434343434344,\n \"acc_stderr\": 0.017646526677233335,\n \"\
acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.017646526677233335\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n\
\ \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8205128205128205,\n \"acc_stderr\": 0.019457390787681803,\n\
\ \"acc_norm\": 0.8205128205128205,\n \"acc_norm_stderr\": 0.019457390787681803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.43703703703703706,\n \"acc_stderr\": 0.030242862397654002,\n \
\ \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.030242862397654002\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8697478991596639,\n \"acc_stderr\": 0.021863258494852118,\n\
\ \"acc_norm\": 0.8697478991596639,\n \"acc_norm_stderr\": 0.021863258494852118\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.48344370860927155,\n \"acc_stderr\": 0.040802441856289694,\n \"\
acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.040802441856289694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9155963302752294,\n \"acc_stderr\": 0.011918819327334879,\n \"\
acc_norm\": 0.9155963302752294,\n \"acc_norm_stderr\": 0.011918819327334879\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6620370370370371,\n \"acc_stderr\": 0.03225941352631295,\n \"\
acc_norm\": 0.6620370370370371,\n \"acc_norm_stderr\": 0.03225941352631295\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065494,\n \
\ \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n\
\ \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n\
\ \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597446,\n\
\ \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597446\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"\
acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8773006134969326,\n \"acc_stderr\": 0.025777328426978927,\n\
\ \"acc_norm\": 0.8773006134969326,\n \"acc_norm_stderr\": 0.025777328426978927\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.9223300970873787,\n \"acc_stderr\": 0.026501440784762752,\n\
\ \"acc_norm\": 0.9223300970873787,\n \"acc_norm_stderr\": 0.026501440784762752\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n\
\ \"acc_stderr\": 0.01700436856813234,\n \"acc_norm\": 0.9273504273504274,\n\
\ \"acc_norm_stderr\": 0.01700436856813234\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9054916985951469,\n\
\ \"acc_stderr\": 0.01046101533819307,\n \"acc_norm\": 0.9054916985951469,\n\
\ \"acc_norm_stderr\": 0.01046101533819307\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.838150289017341,\n \"acc_stderr\": 0.019829299214925416,\n\
\ \"acc_norm\": 0.838150289017341,\n \"acc_norm_stderr\": 0.019829299214925416\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7720670391061453,\n\
\ \"acc_stderr\": 0.014030149950805097,\n \"acc_norm\": 0.7720670391061453,\n\
\ \"acc_norm_stderr\": 0.014030149950805097\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8594771241830066,\n \"acc_stderr\": 0.019899435463539946,\n\
\ \"acc_norm\": 0.8594771241830066,\n \"acc_norm_stderr\": 0.019899435463539946\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8360128617363344,\n\
\ \"acc_stderr\": 0.021029576464662695,\n \"acc_norm\": 0.8360128617363344,\n\
\ \"acc_norm_stderr\": 0.021029576464662695\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.01810541409432967,\n\
\ \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.01810541409432967\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6560283687943262,\n \"acc_stderr\": 0.02833801742861133,\n \
\ \"acc_norm\": 0.6560283687943262,\n \"acc_norm_stderr\": 0.02833801742861133\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6160365058670143,\n\
\ \"acc_stderr\": 0.01242158783313423,\n \"acc_norm\": 0.6160365058670143,\n\
\ \"acc_norm_stderr\": 0.01242158783313423\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8308823529411765,\n \"acc_stderr\": 0.022770868010113018,\n\
\ \"acc_norm\": 0.8308823529411765,\n \"acc_norm_stderr\": 0.022770868010113018\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8202614379084967,\n \"acc_stderr\": 0.01553374508338279,\n \
\ \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.01553374508338279\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8408163265306122,\n \"acc_stderr\": 0.02342097206916635,\n\
\ \"acc_norm\": 0.8408163265306122,\n \"acc_norm_stderr\": 0.02342097206916635\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276908,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276908\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6382238408380394,\n\
\ \"mc2_stderr\": 0.01475552588950266\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8413575374901342,\n \"acc_stderr\": 0.010267936243028214\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7225170583775588,\n \
\ \"acc_stderr\": 0.01233344758104755\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/Cosmosis-3x34B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|arc:challenge|25_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|gsm8k|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hellaswag|10_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T11-59-17.025888.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T11-59-17.025888.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- '**/details_harness|winogrande|5_2024-01-14T11-59-17.025888.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T11-59-17.025888.parquet'
- config_name: results
data_files:
- split: 2024_01_14T11_59_17.025888
path:
- results_2024-01-14T11-59-17.025888.parquet
- split: latest
path:
- results_2024-01-14T11-59-17.025888.parquet
---
# Dataset Card for Evaluation run of Weyaxi/Cosmosis-3x34B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Cosmosis-3x34B](https://huggingface.co/Weyaxi/Cosmosis-3x34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Cosmosis-3x34B",
"harness_winogrande_5",
	split="latest")
```
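Per-run splits are named after the run timestamp, with the `-` and `:` separators replaced by `_` (e.g. `2024-01-14T11:59:17.025888` becomes `2024_01_14T11_59_17.025888`). A small illustrative helper (not part of any library) for deriving the split name from a timestamp:

```python
def split_name_from_timestamp(timestamp: str) -> str:
    """Map a run timestamp to the split name used in this repository."""
    return timestamp.replace("-", "_").replace(":", "_")

print(split_name_from_timestamp("2024-01-14T11:59:17.025888"))
# -> 2024_01_14T11_59_17.025888
```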
## Latest results
These are the [latest results from run 2024-01-14T11:59:17.025888](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Cosmosis-3x34B/blob/main/results_2024-01-14T11-59-17.025888.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7691798340940261,
"acc_stderr": 0.027910883477876437,
"acc_norm": 0.7725855380923361,
"acc_norm_stderr": 0.02844764712553433,
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.6382238408380394,
"mc2_stderr": 0.01475552588950266
},
"harness|arc:challenge|25": {
"acc": 0.6655290102389079,
"acc_stderr": 0.013787460322441377,
"acc_norm": 0.697098976109215,
"acc_norm_stderr": 0.013428241573185347
},
"harness|hellaswag|10": {
"acc": 0.6569408484365664,
"acc_stderr": 0.004737608340163403,
"acc_norm": 0.851822346146186,
"acc_norm_stderr": 0.003545499169558051
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9078947368421053,
"acc_stderr": 0.02353268597044349,
"acc_norm": 0.9078947368421053,
"acc_norm_stderr": 0.02353268597044349
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8150943396226416,
"acc_stderr": 0.02389335183446432,
"acc_norm": 0.8150943396226416,
"acc_norm_stderr": 0.02389335183446432
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9027777777777778,
"acc_stderr": 0.02477451625044016,
"acc_norm": 0.9027777777777778,
"acc_norm_stderr": 0.02477451625044016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.026355158413349417,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.026355158413349417
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.701058201058201,
"acc_stderr": 0.023577604791655802,
"acc_norm": 0.701058201058201,
"acc_norm_stderr": 0.023577604791655802
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5873015873015873,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.5873015873015873,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9064516129032258,
"acc_stderr": 0.016565754668270972,
"acc_norm": 0.9064516129032258,
"acc_norm_stderr": 0.016565754668270972
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6354679802955665,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.6354679802955665,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8727272727272727,
"acc_stderr": 0.02602465765165619,
"acc_norm": 0.8727272727272727,
"acc_norm_stderr": 0.02602465765165619
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.017646526677233335,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.017646526677233335
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527033,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527033
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.019457390787681803,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.019457390787681803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.030242862397654002,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.030242862397654002
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8697478991596639,
"acc_stderr": 0.021863258494852118,
"acc_norm": 0.8697478991596639,
"acc_norm_stderr": 0.021863258494852118
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.040802441856289694,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.040802441856289694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9155963302752294,
"acc_stderr": 0.011918819327334879,
"acc_norm": 0.9155963302752294,
"acc_norm_stderr": 0.011918819327334879
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6620370370370371,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.6620370370370371,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065494,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597446,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597446
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.027285246312758957,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.027285246312758957
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8773006134969326,
"acc_stderr": 0.025777328426978927,
"acc_norm": 0.8773006134969326,
"acc_norm_stderr": 0.025777328426978927
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.625,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.9223300970873787,
"acc_stderr": 0.026501440784762752,
"acc_norm": 0.9223300970873787,
"acc_norm_stderr": 0.026501440784762752
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.01700436856813234,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.01700436856813234
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9054916985951469,
"acc_stderr": 0.01046101533819307,
"acc_norm": 0.9054916985951469,
"acc_norm_stderr": 0.01046101533819307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.838150289017341,
"acc_stderr": 0.019829299214925416,
"acc_norm": 0.838150289017341,
"acc_norm_stderr": 0.019829299214925416
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7720670391061453,
"acc_stderr": 0.014030149950805097,
"acc_norm": 0.7720670391061453,
"acc_norm_stderr": 0.014030149950805097
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8594771241830066,
"acc_stderr": 0.019899435463539946,
"acc_norm": 0.8594771241830066,
"acc_norm_stderr": 0.019899435463539946
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8360128617363344,
"acc_stderr": 0.021029576464662695,
"acc_norm": 0.8360128617363344,
"acc_norm_stderr": 0.021029576464662695
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.01810541409432967,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.01810541409432967
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6560283687943262,
"acc_stderr": 0.02833801742861133,
"acc_norm": 0.6560283687943262,
"acc_norm_stderr": 0.02833801742861133
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6160365058670143,
"acc_stderr": 0.01242158783313423,
"acc_norm": 0.6160365058670143,
"acc_norm_stderr": 0.01242158783313423
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8308823529411765,
"acc_stderr": 0.022770868010113018,
"acc_norm": 0.8308823529411765,
"acc_norm_stderr": 0.022770868010113018
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.01553374508338279,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.01553374508338279
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8408163265306122,
"acc_stderr": 0.02342097206916635,
"acc_norm": 0.8408163265306122,
"acc_norm_stderr": 0.02342097206916635
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276908,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276908
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.6382238408380394,
"mc2_stderr": 0.01475552588950266
},
"harness|winogrande|5": {
"acc": 0.8413575374901342,
"acc_stderr": 0.010267936243028214
},
"harness|gsm8k|5": {
"acc": 0.7225170583775588,
"acc_stderr": 0.01233344758104755
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lionelchg/dolly_brainstorming | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1474035.5198187996
num_examples: 1677
- name: test
num_bytes: 78228.48018120046
num_examples: 89
download_size: 977653
dataset_size: 1552264.0
---
# Dataset Card for "dolly_brainstorming"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/VQAv2SampleValidation_google_flan_t5_xxl_mode_VQAv2_visclues_detection_caption_module_ns_1000_OE | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 141448
num_examples: 1000
download_size: 53292
dataset_size: 141448
---
# Dataset Card for "VQAv2SampleValidation_google_flan_t5_xxl_mode_VQAv2_visclues_detection_caption_module_ns_1000_OE"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yweslakarep123/finetuned-bert-mrpc | ---
license: apache-2.0
---
|
theonerichy/booru_txt2tag_scoresorted | ---
license: gpl-3.0
---
A dataset to help train models to generate danbooru tags that could match a given image description. Pretty mediocre quality for now, but I don't have a great way of getting more data yet.
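As a rough sketch of what an Alpaca-style record might look like for this task (field names follow the standard Alpaca template; the example values below are invented for illustration, not taken from the dataset):

```python
# Illustrative only: field names from the standard Alpaca template;
# the values are made up, not real records from this dataset.
record = {
    "instruction": "Generate danbooru tags matching the image description.",
    "input": "A girl with long black hair standing in the rain.",
    "output": "1girl, black_hair, long_hair, rain, solo",
}

# Assemble the standard Alpaca prompt, then append the target tags.
prompt = (
    "Below is an instruction that describes a task, paired with an input "
    "that provides further context. Write a response that appropriately "
    "completes the request.\n\n"
    f"### Instruction:\n{record['instruction']}\n\n"
    f"### Input:\n{record['input']}\n\n"
    "### Response:\n"
)
print(prompt + record["output"])
```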
The format is the Alpaca instruction format, and the output tags are ordered by popularity. |
vmalonso/dataset_rof | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 511071
num_examples: 1668
download_size: 192449
dataset_size: 511071
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-cnn_dailymail-2bf8ffdd-12835704 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: sysresearch101/t5-large-finetuned-xsum
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: train
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: sysresearch101/t5-large-finetuned-xsum
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@sysresearch101](https://huggingface.co/sysresearch101) for evaluating this model. |
CyberHarem/mitsuba_mutsumi_seitokaiyakuindomo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Mitsuba Mutsumi (Seitokai Yakuindomo)
This is the dataset of Mitsuba Mutsumi (Seitokai Yakuindomo), containing 227 images and their tags.
The core tags of this character are `black_hair, long_hair, ponytail, brown_eyes`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 227 | 106.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mitsuba_mutsumi_seitokaiyakuindomo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 227 | 89.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mitsuba_mutsumi_seitokaiyakuindomo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 428 | 158.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mitsuba_mutsumi_seitokaiyakuindomo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 227 | 106.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mitsuba_mutsumi_seitokaiyakuindomo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 428 | 182.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mitsuba_mutsumi_seitokaiyakuindomo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need this, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mitsuba_mutsumi_seitokaiyakuindomo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
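Once loaded, items can also be filtered by tag before further processing. The `filter_by_tag` helper below is a minimal sketch (it is not part of waifuc) and assumes `item.meta['tags']` is a mapping from tag name to score, as printed above:

```python
from types import SimpleNamespace

def filter_by_tag(items, tag):
    """Yield only items whose tag mapping contains the given tag."""
    for item in items:
        if tag in item.meta.get('tags', {}):
            yield item

# Stand-in items for illustration; real items come from LocalSource.
items = [
    SimpleNamespace(meta={'filename': 'a.png', 'tags': {'1girl': 0.99, 'smile': 0.8}}),
    SimpleNamespace(meta={'filename': 'b.png', 'tags': {'2girls': 0.95}}),
]
print([it.meta['filename'] for it in filter_by_tag(items, '1girl')])  # ['a.png']
```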
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, solo, smile, parody, anime_coloring, school_uniform |
| 1 | 6 |  |  |  |  |  | 1girl, solo, anime_coloring, smile, style_parody, twintails, official_style, school_uniform, bow |
| 2 | 10 |  |  |  |  |  | 1girl, blazer, school_uniform, solo, smile, bow, twintails |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | parody | anime_coloring | school_uniform | style_parody | twintails | official_style | bow | blazer |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:---------|:-----------------|:-----------------|:---------------|:------------|:-----------------|:------|:---------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | | X | X | X | X | X | X | |
| 2 | 10 |  |  |  |  |  | X | X | X | | | X | | X | | X | X |
|
tyzhu/squad_qa_wrong_num_v5_full_recite_ans_sent | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7801171
num_examples: 5070
- name: validation
num_bytes: 406689
num_examples: 300
download_size: 0
dataset_size: 8207860
---
# Dataset Card for "squad_qa_wrong_num_v5_full_recite_ans_sent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
giuseppefutia/test-json-rac | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4750696
num_examples: 1825
download_size: 2231250
dataset_size: 4750696
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ytzi/the-stack-dedup-python-filtered-dec_gen_async-gpt2 | ---
dataset_info:
features:
- name: content
dtype: string
- name: input_ids
sequence: int32
- name: ratio_char_token
dtype: float64
- name: token_count
dtype: int64
splits:
- name: train
num_bytes: 162183078606
num_examples: 12960052
download_size: 47479427571
dataset_size: 162183078606
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CreitinGameplays/elisa-chan-v2 | ---
language:
- en
---
Elisa-chan's dataset, made using ChatGPT.
"Elisa-chan, an exuberant 20-year-old Japanese woman chatbot! Whether your conversation partner is a fan of games, anime, or just needs a mood lift, you've got the perfect remedy. Encourage them to open up, sharing their thoughts or seeking advice, as you're dedicated to brightening their day. Remind them that if they ever feel a bit low, you're here to effortlessly bring a smile to their face." |
Atharva07/pii_detect_train | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: document
dtype: int64
- name: full_text
dtype: string
- name: tokens
sequence: string
- name: trailing_whitespace
sequence: bool
- name: labels
sequence:
class_label:
names:
'0': B-EMAIL
'1': B-ID_NUM
'2': B-NAME_STUDENT
'3': B-PHONE_NUM
'4': B-STREET_ADDRESS
'5': B-URL_PERSONAL
'6': B-USERNAME
'7': I-ID_NUM
'8': I-NAME_STUDENT
'9': I-PHONE_NUM
'10': I-STREET_ADDRESS
'11': I-URL_PERSONAL
'12': O
splits:
- name: train
num_bytes: 107466375
num_examples: 6807
download_size: 25506099
dataset_size: 107466375
---
# Dataset Card for "pii_detect_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
minh009/test1 | ---
license: openrail
task_categories:
- text-classification
language:
- en
size_categories:
- n<1K
--- |
yangtao9009/PASD_dataset | ---
license: apache-2.0
---
You can download our datasets from Huggingface now including [DIV2K](https://huggingface.co/datasets/yangtao9009/DIV2K), [Flickr2K](https://huggingface.co/datasets/yangtao9009/Flickr2K), [DIV8K](https://huggingface.co/datasets/yangtao9009/DIV8K), [FFHQ1024](https://huggingface.co/datasets/yangtao9009/FFHQ1024), [OSTv2](https://huggingface.co/datasets/yangtao9009/OSTv2), and [Unsplash2K](https://huggingface.co/datasets/yangtao9009/Unsplash2K). |
CyberHarem/hua_yuan_yurine_jashinchandropkick | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Hanazono Yurine (花園ゆりね)
This is the dataset of Hanazono Yurine (花園ゆりね), containing 276 images and their tags.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 276 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 638 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 276 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 276 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 276 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 276 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 276 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 638 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 638 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 638 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
alvarobartt/orca_dpo_pairs_openai | ---
dataset_info:
features:
- name: prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 35404518
num_examples: 12859
download_size: 19040901
dataset_size: 35404518
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
WiktorS/polish-news | ---
license: apache-2.0
task_categories:
- text-classification
- summarization
- text-generation
language:
- pl
size_categories:
- 100K<n<1M
---
This dataset contains more than 250k articles obtained from the Polish news site `tvp.info.pl`.
The main purpose of collecting the data was to create a transformer-based model for text summarization.
Columns:
* `link` - link to the article
* `title` - original title of the article
* `headline` - lead/headline of the article: the first paragraph, visible directly from the page
* `content` - full textual content of the article
Link to original repo: https://github.com/WiktorSob/scraper-tvp
Download the data:
```python
from datasets import load_dataset
dataset = load_dataset("WiktorS/polish-news")
``` |
mstz/optdigits | ---
language:
- en
tags:
- optdigits
- tabular_classification
- binary_classification
- multiclass_classification
- UCI
pretty_name: Optdigits
task_categories: # Full list at https://github.com/huggingface/hub-docs/blob/main/js/src/lib/interfaces/Types.ts
- tabular-classification
configs:
- optdigits
---
# Optdigits
The [Optdigits dataset](https://archive-beta.ics.uci.edu/dataset/80/optical+recognition+of+handwritten+digits) from the [UCI repository](https://archive-beta.ics.uci.edu/).
# Configurations and tasks
| **Configuration** | **Task** | **Description** |
|-----------------------|---------------------------|-------------------------|
| optdigits | Multiclass classification.| |
| 0 | Binary classification. | Is this a 0? |
| 1 | Binary classification. | Is this a 1? |
| 2 | Binary classification. | Is this a 2? |
| ... | Binary classification. | ... |
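The binary configurations can be read as one-vs-rest views of the multiclass labels. As a rough sketch (assuming integer class labels 0-9; this is not the loader's actual code), the per-digit binarization is:

```python
def binarize(labels, digit):
    """One-vs-rest view: 1 where the multiclass label equals `digit`, else 0."""
    return [1 if y == digit else 0 for y in labels]

labels = [0, 3, 0, 7, 3]
print(binarize(labels, 0))  # [1, 0, 1, 0, 0]
print(binarize(labels, 3))  # [0, 1, 0, 0, 1]
```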
|
atsushi3110/sft-part-chosen-rejected-pairs | ---
license: creativeml-openrail-m
---
|
katarinagresova/Genomic_Benchmarks_human_nontata_promoters | ---
dataset_info:
features:
- name: seq
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 7126511
num_examples: 27097
- name: test
num_bytes: 2375942
num_examples: 9034
download_size: 0
dataset_size: 9502453
---
# Dataset Card for "Genomic_Benchmarks_human_nontata_promoters"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Udbhav44/qu-bit | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 410889
num_examples: 94
download_size: 100327
dataset_size: 410889
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yjernite/prof_report__SD_v1.4_random_seeds__sd_21__24 | ---
dataset_info:
features:
- name: cluster_id
dtype: int64
- name: cluster_size
dtype: int64
- name: img_ids
sequence: int64
- name: img_cluster_scores
sequence: float64
splits:
- name: paralegal
num_bytes: 3600
num_examples: 10
- name: bartender
num_bytes: 3504
num_examples: 6
- name: facilities_manager
num_bytes: 3600
num_examples: 10
- name: accountant
num_bytes: 3600
num_examples: 10
- name: graphic_designer
num_bytes: 3672
num_examples: 13
- name: network_administrator
num_bytes: 3408
num_examples: 2
- name: financial_manager
num_bytes: 3624
num_examples: 11
- name: baker
num_bytes: 3720
num_examples: 15
- name: security_guard
num_bytes: 3648
num_examples: 12
- name: artist
num_bytes: 3840
num_examples: 20
- name: author
num_bytes: 3600
num_examples: 10
- name: printing_press_operator
num_bytes: 3552
num_examples: 8
- name: public_relations_specialist
num_bytes: 3648
num_examples: 12
- name: sheet_metal_worker
num_bytes: 3576
num_examples: 9
- name: clergy
num_bytes: 3648
num_examples: 12
- name: payroll_clerk
num_bytes: 3552
num_examples: 8
- name: teller
num_bytes: 3816
num_examples: 19
- name: real_estate_broker
num_bytes: 3552
num_examples: 8
- name: customer_service_representative
num_bytes: 3600
num_examples: 10
- name: painter
num_bytes: 3768
num_examples: 17
- name: tractor_operator
num_bytes: 3480
num_examples: 5
- name: dental_hygienist
num_bytes: 3504
num_examples: 6
- name: industrial_engineer
num_bytes: 3576
num_examples: 9
- name: electrician
num_bytes: 3480
num_examples: 5
- name: head_cook
num_bytes: 3744
num_examples: 16
- name: health_technician
num_bytes: 3600
num_examples: 10
- name: carpet_installer
num_bytes: 3456
num_examples: 4
- name: purchasing_agent
num_bytes: 3624
num_examples: 11
- name: supervisor
num_bytes: 3696
num_examples: 14
- name: civil_engineer
num_bytes: 3648
num_examples: 12
- name: lawyer
num_bytes: 3720
num_examples: 15
- name: language_pathologist
num_bytes: 3600
num_examples: 10
- name: ceo
num_bytes: 3672
num_examples: 13
- name: computer_support_specialist
num_bytes: 3600
num_examples: 10
- name: postal_worker
num_bytes: 3672
num_examples: 13
- name: mechanical_engineer
num_bytes: 3648
num_examples: 12
- name: nursing_assistant
num_bytes: 3552
num_examples: 8
- name: dentist
num_bytes: 3624
num_examples: 11
- name: tutor
num_bytes: 3720
num_examples: 15
- name: butcher
num_bytes: 3648
num_examples: 12
- name: insurance_agent
num_bytes: 3528
num_examples: 7
- name: courier
num_bytes: 3720
num_examples: 15
- name: computer_programmer
num_bytes: 3624
num_examples: 11
- name: truck_driver
num_bytes: 3504
num_examples: 6
- name: mechanic
num_bytes: 3528
num_examples: 7
- name: marketing_manager
num_bytes: 3528
num_examples: 7
- name: sales_manager
num_bytes: 3528
num_examples: 7
- name: correctional_officer
num_bytes: 3696
num_examples: 14
- name: manager
num_bytes: 3648
num_examples: 12
- name: underwriter
num_bytes: 3672
num_examples: 13
- name: executive_assistant
num_bytes: 3600
num_examples: 10
- name: designer
num_bytes: 3648
num_examples: 12
- name: groundskeeper
num_bytes: 3480
num_examples: 5
- name: mental_health_counselor
num_bytes: 3672
num_examples: 13
- name: aerospace_engineer
num_bytes: 3648
num_examples: 12
- name: taxi_driver
num_bytes: 3696
num_examples: 14
- name: nurse
num_bytes: 3576
num_examples: 9
- name: data_entry_keyer
num_bytes: 3624
num_examples: 11
- name: musician
num_bytes: 3696
num_examples: 14
- name: event_planner
num_bytes: 3552
num_examples: 8
- name: writer
num_bytes: 3672
num_examples: 13
- name: cook
num_bytes: 3792
num_examples: 18
- name: welder
num_bytes: 3624
num_examples: 11
- name: producer
num_bytes: 3744
num_examples: 16
- name: hairdresser
num_bytes: 3600
num_examples: 10
- name: farmer
num_bytes: 3528
num_examples: 7
- name: construction_worker
num_bytes: 3504
num_examples: 6
- name: air_conditioning_installer
num_bytes: 3432
num_examples: 3
- name: electrical_engineer
num_bytes: 3648
num_examples: 12
- name: occupational_therapist
num_bytes: 3624
num_examples: 11
- name: career_counselor
num_bytes: 3600
num_examples: 10
- name: interior_designer
num_bytes: 3624
num_examples: 11
- name: jailer
num_bytes: 3744
num_examples: 16
- name: office_clerk
num_bytes: 3624
num_examples: 11
- name: market_research_analyst
num_bytes: 3576
num_examples: 9
- name: laboratory_technician
num_bytes: 3624
num_examples: 11
- name: social_assistant
num_bytes: 3744
num_examples: 16
- name: medical_records_specialist
num_bytes: 3576
num_examples: 9
- name: machinery_mechanic
num_bytes: 3552
num_examples: 8
- name: police_officer
num_bytes: 3672
num_examples: 13
- name: software_developer
num_bytes: 3528
num_examples: 7
- name: clerk
num_bytes: 3720
num_examples: 15
- name: salesperson
num_bytes: 3648
num_examples: 12
- name: social_worker
num_bytes: 3744
num_examples: 16
- name: director
num_bytes: 3720
num_examples: 15
- name: fast_food_worker
num_bytes: 3696
num_examples: 14
- name: singer
num_bytes: 3792
num_examples: 18
- name: metal_worker
num_bytes: 3576
num_examples: 9
- name: cleaner
num_bytes: 3792
num_examples: 18
- name: computer_systems_analyst
num_bytes: 3600
num_examples: 10
- name: dental_assistant
num_bytes: 3504
num_examples: 6
- name: psychologist
num_bytes: 3696
num_examples: 14
- name: machinist
num_bytes: 3648
num_examples: 12
- name: therapist
num_bytes: 3648
num_examples: 12
- name: veterinarian
num_bytes: 3576
num_examples: 9
- name: teacher
num_bytes: 3720
num_examples: 15
- name: architect
num_bytes: 3720
num_examples: 15
- name: office_worker
num_bytes: 3672
num_examples: 13
- name: drywall_installer
num_bytes: 3480
num_examples: 5
- name: nutritionist
num_bytes: 3480
num_examples: 5
- name: librarian
num_bytes: 3672
num_examples: 13
- name: childcare_worker
num_bytes: 3576
num_examples: 9
- name: school_bus_driver
num_bytes: 3696
num_examples: 14
- name: file_clerk
num_bytes: 3600
num_examples: 10
- name: logistician
num_bytes: 3576
num_examples: 9
- name: scientist
num_bytes: 3648
num_examples: 12
- name: teaching_assistant
num_bytes: 3672
num_examples: 13
- name: radiologic_technician
num_bytes: 3600
num_examples: 10
- name: manicurist
num_bytes: 3576
num_examples: 9
- name: community_manager
num_bytes: 3576
num_examples: 9
- name: carpenter
num_bytes: 3480
num_examples: 5
- name: claims_appraiser
num_bytes: 3576
num_examples: 9
- name: dispatcher
num_bytes: 3528
num_examples: 7
- name: cashier
num_bytes: 3600
num_examples: 10
- name: roofer
num_bytes: 3504
num_examples: 6
- name: photographer
num_bytes: 3792
num_examples: 18
- name: detective
num_bytes: 3648
num_examples: 12
- name: financial_advisor
num_bytes: 3576
num_examples: 9
- name: wholesale_buyer
num_bytes: 3672
num_examples: 13
- name: it_specialist
num_bytes: 3552
num_examples: 8
- name: pharmacy_technician
num_bytes: 3504
num_examples: 6
- name: engineer
num_bytes: 3648
num_examples: 12
- name: mover
num_bytes: 3768
num_examples: 17
- name: plane_mechanic
num_bytes: 3624
num_examples: 11
- name: interviewer
num_bytes: 3672
num_examples: 13
- name: massage_therapist
num_bytes: 3624
num_examples: 11
- name: dishwasher
num_bytes: 3672
num_examples: 13
- name: fitness_instructor
num_bytes: 3600
num_examples: 10
- name: credit_counselor
num_bytes: 3624
num_examples: 11
- name: stocker
num_bytes: 3816
num_examples: 19
- name: pharmacist
num_bytes: 3672
num_examples: 13
- name: doctor
num_bytes: 3672
num_examples: 13
- name: compliance_officer
num_bytes: 3648
num_examples: 12
- name: aide
num_bytes: 3768
num_examples: 17
- name: bus_driver
num_bytes: 3672
num_examples: 13
- name: financial_analyst
num_bytes: 3624
num_examples: 11
- name: receptionist
num_bytes: 3504
num_examples: 6
- name: janitor
num_bytes: 3672
num_examples: 13
- name: plumber
num_bytes: 3504
num_examples: 6
- name: physical_therapist
num_bytes: 3600
num_examples: 10
- name: inventory_clerk
num_bytes: 3552
num_examples: 8
- name: firefighter
num_bytes: 3600
num_examples: 10
- name: coach
num_bytes: 3696
num_examples: 14
- name: maid
num_bytes: 3648
num_examples: 12
- name: pilot
num_bytes: 3696
num_examples: 14
- name: repair_worker
num_bytes: 3624
num_examples: 11
download_size: 871516
dataset_size: 529248
---
# Dataset Card for "prof_report__SD_v1.4_random_seeds__sd_21__24"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_stsb_myself_coordinate_subjects | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 208
num_examples: 1
- name: train
num_bytes: 168
num_examples: 1
download_size: 0
dataset_size: 376
---
# Dataset Card for "MULTI_VALUE_stsb_myself_coordinate_subjects"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2 | ---
pretty_name: Evaluation run of migtissera/Synthia-70B-v1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Synthia-70B-v1.2](https://huggingface.co/migtissera/Synthia-70B-v1.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T14:51:20.480254](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2/blob/main/results_2023-10-17T14-51-20.480254.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.364618288590604,\n\
\ \"em_stderr\": 0.004929197624393639,\n \"f1\": 0.42417365771812215,\n\
\ \"f1_stderr\": 0.004776577842624861,\n \"acc\": 0.5759284047791582,\n\
\ \"acc_stderr\": 0.011665477241539865\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.364618288590604,\n \"em_stderr\": 0.004929197624393639,\n\
\ \"f1\": 0.42417365771812215,\n \"f1_stderr\": 0.004776577842624861\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3191811978771797,\n \
\ \"acc_stderr\": 0.012840345676251651\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828079\n\
\ }\n}\n```"
repo_url: https://huggingface.co/migtissera/Synthia-70B-v1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|arc:challenge|25_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T14_51_20.480254
path:
- '**/details_harness|drop|3_2023-10-17T14-51-20.480254.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T14-51-20.480254.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T14_51_20.480254
path:
- '**/details_harness|gsm8k|5_2023-10-17T14-51-20.480254.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T14-51-20.480254.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hellaswag|10_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T14_51_20.480254
path:
- '**/details_harness|winogrande|5_2023-10-17T14-51-20.480254.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T14-51-20.480254.parquet'
- config_name: results
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- results_2023-09-02T17:59:05.420313.parquet
- split: 2023_10_17T14_51_20.480254
path:
- results_2023-10-17T14-51-20.480254.parquet
- split: latest
path:
- results_2023-10-17T14-51-20.480254.parquet
---
# Dataset Card for Evaluation run of migtissera/Synthia-70B-v1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Synthia-70B-v1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Synthia-70B-v1.2](https://huggingface.co/migtissera/Synthia-70B-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-17T14:51:20.480254](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2/blob/main/results_2023-10-17T14-51-20.480254.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.364618288590604,
"em_stderr": 0.004929197624393639,
"f1": 0.42417365771812215,
"f1_stderr": 0.004776577842624861,
"acc": 0.5759284047791582,
"acc_stderr": 0.011665477241539865
},
"harness|drop|3": {
"em": 0.364618288590604,
"em_stderr": 0.004929197624393639,
"f1": 0.42417365771812215,
"f1_stderr": 0.004776577842624861
},
"harness|gsm8k|5": {
"acc": 0.3191811978771797,
"acc_stderr": 0.012840345676251651
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828079
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
biglam/oldbookillustrations |
Mai321/ZeroTwo-PlugSuit | ---
license: openrail
---
|
NathanGavenski/CartPole-v1 | ---
license: mit
tags:
- Imitation Learning
- Expert Trajectory
pretty_name: CartPole-v1 Expert Dataset
size_categories:
- 10M<n<100M
---
# CartPole-v1 - Imitation Learning Datasets
This is a dataset created by the [Imitation Learning Datasets](https://github.com/NathanGavenski/IL-Datasets) project.
It was created using Stable Baselines weights of a PPO policy from [HuggingFace](https://huggingface.co/sb3/ppo-CartPole-v1).
## Description
The dataset consists of 1,000 episodes with an average episodic reward of 500.
Each entry consists of:
```
obs (list): observation with length 4.
action (int): action (0 or 1).
reward (float): reward point for that timestep.
episode_returns (bool): whether this state is the initial timestep of an episode.
```
## Usage
Feel free to download and use the `teacher.jsonl` dataset as you please.
If you are interested in using our PyTorch Dataset implementation, check the [IL Datasets](https://github.com/NathanGavenski/IL-Datasets/blob/main/src/imitation_datasets/dataset/dataset.py) project.
There, we implement a base Dataset that downloads this dataset, and all other datasets, directly from HuggingFace.
The base Dataset also allows for more control over train and test splits and over how many episodes you want to use (for cases where all 1,000 episodes are not necessary).
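As a rough sketch of how entries in this schema can be consumed without the helper library, episodes can be reassembled from the `episode_returns` flag. The sample JSONL lines below are illustrative placeholders, not records taken from the real `teacher.jsonl`:

```python
import json

# Illustrative JSONL lines following the entry schema above
# (real values come from teacher.jsonl).
sample_lines = [
    '{"obs": [0.01, -0.02, 0.03, 0.04], "action": 1, "reward": 1.0, "episode_returns": true}',
    '{"obs": [0.02, 0.01, 0.02, -0.01], "action": 0, "reward": 1.0, "episode_returns": false}',
]

entries = [json.loads(line) for line in sample_lines]

# episode_returns marks the initial timestep of an episode, so it
# can be used to split the flat list of timesteps into episodes.
episodes = []
for entry in entries:
    if entry["episode_returns"]:
        episodes.append([])
    episodes[-1].append(entry)

# Per-episode return: sum of the reward field over each episode.
episode_rewards = [sum(step["reward"] for step in ep) for ep in episodes]
print(len(episodes), episode_rewards)
```

For the full file, the same loop applies line by line; with 1,000 episodes averaging a return of 500, `episode_rewards` should average close to 500.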
## Citation
Coming soon. |
PL-MTEB/cdsce-pairclassification | ---
license: cc-by-nc-sa-4.0
---
|
Juniplayground/ragas-retreival_top1 | ---
dataset_info:
features:
- name: benchmark_date
dtype: string
- name: question
dtype: string
- name: answer
dtype: float64
- name: contexts
dtype: string
- name: ground_truth
dtype: string
- name: answer_relevancy
dtype: float64
- name: context_precision
dtype: float64
- name: rag_benchmark_decision
dtype: string
- name: response_benchmark_decision
dtype: string
- name: correct_rag_benchmark
dtype: int64
- name: incorrect_rag_benchmark
dtype: int64
- name: correct_response_benchmark
dtype: int64
- name: incorrect_response_benchmark
dtype: int64
splits:
- name: train
num_bytes: 3689380
num_examples: 1180
download_size: 1095559
dataset_size: 3689380
---
# Dataset Card for "ragas-retreival_top1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
augmxnt/ultra-orca-boros-en-ja-v1 | ---
license: apache-2.0
---
EN/JA dataset used for [shisa-7b-v1](https://hf.co/augmxnt/shisa-7b-v1) - see details in that model's readme. |
NagaSaiAbhinay/CheckpointMergerSamples | ---
license: openrail
---
|
CyberHarem/power_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of power/パワー/帕瓦/파워 (Nikke: Goddess of Victory)
This is the dataset of power/パワー/帕瓦/파워 (Nikke: Goddess of Victory), containing 500 images and their tags.
The core tags of this character are `long_hair, horns, red_horns, symbol-shaped_pupils, cross-shaped_pupils, hair_between_eyes, demon_horns, blonde_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), and the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.02 GiB | [Download](https://huggingface.co/datasets/CyberHarem/power_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 472.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/power_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1286 | 1.04 GiB | [Download](https://huggingface.co/datasets/CyberHarem/power_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 861.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/power_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1286 | 1.69 GiB | [Download](https://huggingface.co/datasets/CyberHarem/power_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/power_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, black_necktie, collared_shirt, looking_at_viewer, sharp_teeth, solo, white_shirt, open_mouth, simple_background, blue_jacket, smile, upper_body, yellow_eyes, white_background, medium_breasts, blush |
| 1 | 7 |  |  |  |  |  | 1girl, black_necktie, collared_shirt, looking_at_viewer, open_mouth, red_eyes, sharp_teeth, solo, white_shirt, demon_girl, upper_body, black_jacket, simple_background, tongue_out |
| 2 | 5 |  |  |  |  |  | 1girl, black_necktie, black_pants, collared_shirt, sharp_teeth, solo, white_shirt, holding_weapon, looking_at_viewer, open_mouth, smile, yellow_eyes, blue_jacket, holding_scythe, medium_breasts, pants_rolled_up, sneakers, full_body, holding_hammer, sky, white_footwear |
| 3 | 7 |  |  |  |  |  | 1girl, black_necktie, black_pants, collared_shirt, looking_at_viewer, sharp_teeth, solo, white_shirt, grin, hand_on_hip, medium_breasts, red_eyes, holding_scythe, long_sleeves, simple_background, blue_jacket, cowboy_shot, holding_weapon, bangs, demon_girl, open_clothes |
| 4 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, pink_hair, shirt, solo, open_mouth, sitting, smile, barefoot, collarbone, long_sleeves, red_eyes, sharp_teeth, simple_background, white_background, feet, short_shorts, toes |
| 5 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, yellow_eyes, black_ribbon, maid_headdress, sharp_teeth, simple_background, twintails, frills, maid_apron, smile, alternate_costume, black_dress, bow, open_mouth, red_background, white_background |
| 6 | 5 |  |  |  |  |  | 1boy, 1girl, anus, hetero, penis, solo_focus, ass, blush, bottomless, pussy, sharp_teeth, vaginal, long_sleeves, looking_at_viewer, looking_back, mosaic_censoring, open_mouth, orange_eyes, pov, sex_from_behind, sweat, navel, pink_shirt, thighs, toilet, uncensored, yellow_eyes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_necktie | collared_shirt | looking_at_viewer | sharp_teeth | solo | white_shirt | open_mouth | simple_background | blue_jacket | smile | upper_body | yellow_eyes | white_background | medium_breasts | blush | red_eyes | demon_girl | black_jacket | tongue_out | black_pants | holding_weapon | holding_scythe | pants_rolled_up | sneakers | full_body | holding_hammer | sky | white_footwear | grin | hand_on_hip | long_sleeves | cowboy_shot | bangs | open_clothes | pink_hair | shirt | sitting | barefoot | collarbone | feet | short_shorts | toes | black_ribbon | maid_headdress | twintails | frills | maid_apron | alternate_costume | black_dress | bow | red_background | 1boy | anus | hetero | penis | solo_focus | ass | bottomless | pussy | vaginal | looking_back | mosaic_censoring | orange_eyes | pov | sex_from_behind | sweat | navel | pink_shirt | thighs | toilet | uncensored |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:-----------------|:--------------------|:--------------|:-------|:--------------|:-------------|:--------------------|:--------------|:--------|:-------------|:--------------|:-------------------|:-----------------|:--------|:-----------|:-------------|:---------------|:-------------|:--------------|:-----------------|:-----------------|:------------------|:-----------|:------------|:-----------------|:------|:-----------------|:-------|:--------------|:---------------|:--------------|:--------|:---------------|:------------|:--------|:----------|:-----------|:-------------|:-------|:---------------|:-------|:---------------|:-----------------|:------------|:---------|:-------------|:--------------------|:--------------|:------|:-----------------|:-------|:-------|:---------|:--------|:-------------|:------|:-------------|:--------|:----------|:---------------|:-------------------|:--------------|:------|:------------------|:--------|:--------|:-------------|:---------|:---------|:-------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | | X | | X | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | | | | | X | | X | X | | | X | X | X | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | | X | X | X | | X | X | | X | | | X | | | X | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | X | X | X | | X | X | | X | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | X | X | | | X | | | | | X | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
pooyaphoenix/hystoclass | ---
license: openrail
task_categories:
- text-classification
- token-classification
language:
- fa
tags:
- tabular_data
- Text Classification
- Social Networks
- Ensemble Learning
pretty_name: hystoclass
size_categories:
- 1K<n<10K
---
# Dataset Summary
**hystoclass** (hybrid social text and tabular classification) was collected from Instagram stories with privacy in mind. In addition to the texts published in the stories, this dataset includes graphical features such as background color, text color, and font, as well as a textual feature named `content` in the Persian language.
# Classes
This dataset is divided into **18 classes** by human supervision:
Event, Political, Advertising and business, Romantic, Motivational, Literature, Social Networks, Scientific, Social, IT, Advices, Academic, Cosmetic and Feminine, Religious, Sport, Property and housing, Tourism, and Medical.
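Since the dataset combines a Persian text field with tabular-style graphical features, one simple hybrid baseline is to serialize those features into the text input before classification. A minimal sketch (the column names `content`, `background_color`, `text_color`, and `font` are assumptions based on the description above, not verified field names):

```python
def serialize_example(row):
    """Concatenate the Persian story text with the graphical features so a
    single text classifier can consume both (a simple hybrid baseline)."""
    tabular = (
        f"background: {row['background_color']} | "
        f"text color: {row['text_color']} | "
        f"font: {row['font']}"
    )
    return f"{row['content']} [SEP] {tabular}"

row = {
    'content': 'متن نمونه',  # sample Persian text
    'background_color': 'blue',
    'text_color': 'white',
    'font': 'nazanin',
}
print(serialize_example(row))
```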
[Github](https://github.com/pooyaphoenix/hystoclass)
[Email](mailto:pooyachavoshi@gmail.com)
|
doudou1206/fill50k | ---
dataset_info:
features:
- name: image
dtype: image
- name: guide
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 454411979.0
num_examples: 50000
download_size: 316021533
dataset_size: 454411979.0
---
# Dataset Card for "fill50k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/kilt_codec_economics | ---
pretty_name: '`kilt/codec/economics`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `kilt/codec/economics`
The `kilt/codec/economics` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/kilt#kilt/codec/economics).
# Data
This dataset provides:
- `queries` (i.e., topics); count=14
- `qrels` (relevance assessments); count=1,970
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/kilt_codec_economics', 'queries')
for record in queries:
record # {'query_id': ..., 'query': ..., 'domain': ..., 'guidelines': ...}
qrels = load_dataset('irds/kilt_codec_economics', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{mackie2022codec,
title={CODEC: Complex Document and Entity Collection},
author={Mackie, Iain and Owoicho, Paul and Gemmell, Carlos and Fischer, Sophie and MacAvaney, Sean and Dalton, Jeffery},
booktitle={Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval},
year={2022}
}
```
|
benchang1110/technewstw | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: title
dtype: string
- name: article
dtype: string
splits:
- name: train
num_bytes: 331062928
num_examples: 113604
download_size: 215556935
dataset_size: 331062928
---
# Dataset Card for "technewstw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mbazaNLP/fleurs-kinyarwanda | ---
annotations_creators:
- expert-generated
- crowdsourced
language_creators:
- crowdsourced
- expert-generated
language:
- rw
license:
- cc-by-4.0
size_categories:
- 1K<n<10K
task_categories:
- automatic-speech-recognition
tags:
- speech-recognition
- fleurs-dataset
pretty_name: Fleurs dataset Kinyarwanda
---
# FLEURS Kinyarwanda dataset
FLEURS is a multilingual text and audio dataset. The original dataset was created by [Google](https://huggingface.co/datasets/google/fleurs). It can be used for building speech-to-text, speech-to-text translation, and speech-to-speech translation systems, and it is a good tool for benchmarking speech applications, especially across languages. Until now, Kinyarwanda did not have a FLEURS dataset, which hindered opportunities for building Kinyarwanda speech technology.
This dataset was created by 29 linguists who participated in the "Training NLP for Linguists" program, with a focus on machine translation.
## Dataset Creation
Each sentence has 2–4 different recordings.
### Data Fields
The data fields are the same among all splits.
- **id** (int): ID of audio sample
- **num_samples** (int): Number of float values
- **path** (str): Path to the audio file
- **audio** (dict): Audio object including the loaded audio array, sampling rate, and path to the audio file
- **raw_transcription** (str): The non-normalized transcription of the audio file
- **transcription** (str): Transcription of the audio file
- **gender** (int): Class id of gender
- **lang_id** (int): Class id of language
- **lang_group_id** (int): Class id of language group
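As a quick sanity check, a clip's duration can be derived from `num_samples` and the audio's sampling rate. A minimal sketch against a mock record (FLEURS audio is typically sampled at 16 kHz, which we assume here):

```python
def clip_duration_seconds(record):
    """Duration in seconds = number of samples / sampling rate."""
    return record['num_samples'] / record['audio']['sampling_rate']

# Mock record mirroring the data fields listed above
record = {
    'id': 0,
    'num_samples': 48000,
    'audio': {'sampling_rate': 16000, 'array': [], 'path': 'dummy.wav'},
}
print(clip_duration_seconds(record))  # → 3.0
```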
### Contribution
Thanks to all the linguists who contributed, to their teacher Samuel Olanrewaju, and to [Kleber Kabanda](https://huggingface.co/Kleber) for curating and uploading the dataset. |
hlt-lab/dialogsumsample-change_names | ---
dataset_info:
features:
- name: context
dtype: string
- name: response
dtype: string
- name: reference
dtype: string
splits:
- name: train
num_bytes: 51251
num_examples: 59
download_size: 39410
dataset_size: 51251
---
# Dataset Card for "dialogsumsample-change_names"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Estwld/atomic2020-instruct | ---
dataset_info:
features:
- name: knowledge_type
dtype: string
- name: task_type
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: validation
num_bytes: 24883204
num_examples: 58390
- name: test
num_bytes: 38577951
num_examples: 89328
- name: train
num_bytes: 257687540
num_examples: 600586
download_size: 51924047
dataset_size: 321148695
---
# Dataset Card for "atomic2020-instruct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_damerajee__Gaja-v2.00 | ---
pretty_name: Evaluation run of damerajee/Gaja-v2.00
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [damerajee/Gaja-v2.00](https://huggingface.co/damerajee/Gaja-v2.00) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_damerajee__Gaja-v2.00\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-06T22:55:47.087661](https://huggingface.co/datasets/open-llm-leaderboard/details_damerajee__Gaja-v2.00/blob/main/results_2024-03-06T22-55-47.087661.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4089443174753084,\n\
\ \"acc_stderr\": 0.034165114593211754,\n \"acc_norm\": 0.4147160920686785,\n\
\ \"acc_norm_stderr\": 0.035075942143371355,\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842899,\n \"mc2\": 0.41500345275114237,\n\
\ \"mc2_stderr\": 0.014673369121203574\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4658703071672355,\n \"acc_stderr\": 0.014577311315231106,\n\
\ \"acc_norm\": 0.5179180887372014,\n \"acc_norm_stderr\": 0.014602005585490976\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5660227046405099,\n\
\ \"acc_stderr\": 0.004946089230153019,\n \"acc_norm\": 0.7579167496514638,\n\
\ \"acc_norm_stderr\": 0.0042746901436291505\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.040179012759817494,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.040179012759817494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4037735849056604,\n \"acc_stderr\": 0.03019761160019795,\n\
\ \"acc_norm\": 0.4037735849056604,\n \"acc_norm_stderr\": 0.03019761160019795\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.31213872832369943,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.31213872832369943,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179962,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179962\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.030472973363380045,\n\
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.030472973363380045\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.20175438596491227,\n\
\ \"acc_stderr\": 0.03775205013583638,\n \"acc_norm\": 0.20175438596491227,\n\
\ \"acc_norm_stderr\": 0.03775205013583638\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.038932596106046734,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.038932596106046734\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.45483870967741935,\n\
\ \"acc_stderr\": 0.028327743091561063,\n \"acc_norm\": 0.45483870967741935,\n\
\ \"acc_norm_stderr\": 0.028327743091561063\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.038835659779569286,\n\
\ \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.038835659779569286\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.41919191919191917,\n \"acc_stderr\": 0.035155207286704175,\n \"\
acc_norm\": 0.41919191919191917,\n \"acc_norm_stderr\": 0.035155207286704175\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.538860103626943,\n \"acc_stderr\": 0.03597524411734577,\n\
\ \"acc_norm\": 0.538860103626943,\n \"acc_norm_stderr\": 0.03597524411734577\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.37948717948717947,\n \"acc_stderr\": 0.024603626924097413,\n\
\ \"acc_norm\": 0.37948717948717947,\n \"acc_norm_stderr\": 0.024603626924097413\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230186,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230186\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3739495798319328,\n \"acc_stderr\": 0.031429466378837076,\n\
\ \"acc_norm\": 0.3739495798319328,\n \"acc_norm_stderr\": 0.031429466378837076\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.47522935779816516,\n \"acc_stderr\": 0.021410999753635918,\n \"\
acc_norm\": 0.47522935779816516,\n \"acc_norm_stderr\": 0.021410999753635918\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686185,\n \"\
acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686185\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5049019607843137,\n \"acc_stderr\": 0.035091433756067845,\n \"\
acc_norm\": 0.5049019607843137,\n \"acc_norm_stderr\": 0.035091433756067845\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5611814345991561,\n \"acc_stderr\": 0.032302649315470375,\n \
\ \"acc_norm\": 0.5611814345991561,\n \"acc_norm_stderr\": 0.032302649315470375\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.48878923766816146,\n\
\ \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.48878923766816146,\n\
\ \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4351145038167939,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.4351145038167939,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6033057851239669,\n \"acc_stderr\": 0.044658697805310094,\n \"\
acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.044658697805310094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4351851851851852,\n\
\ \"acc_stderr\": 0.04792898170907062,\n \"acc_norm\": 0.4351851851851852,\n\
\ \"acc_norm_stderr\": 0.04792898170907062\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.32515337423312884,\n \"acc_stderr\": 0.036803503712864595,\n\
\ \"acc_norm\": 0.32515337423312884,\n \"acc_norm_stderr\": 0.036803503712864595\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.048026946982589726,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.048026946982589726\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5341880341880342,\n\
\ \"acc_stderr\": 0.03267942734081228,\n \"acc_norm\": 0.5341880341880342,\n\
\ \"acc_norm_stderr\": 0.03267942734081228\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.017867695938429778,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.017867695938429778\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.45664739884393063,\n \"acc_stderr\": 0.026817718130348916,\n\
\ \"acc_norm\": 0.45664739884393063,\n \"acc_norm_stderr\": 0.026817718130348916\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n\
\ \"acc_stderr\": 0.014593620923210735,\n \"acc_norm\": 0.2558659217877095,\n\
\ \"acc_norm_stderr\": 0.014593620923210735\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.434640522875817,\n \"acc_stderr\": 0.028384256704883037,\n\
\ \"acc_norm\": 0.434640522875817,\n \"acc_norm_stderr\": 0.028384256704883037\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.49517684887459806,\n\
\ \"acc_stderr\": 0.02839677044411129,\n \"acc_norm\": 0.49517684887459806,\n\
\ \"acc_norm_stderr\": 0.02839677044411129\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4567901234567901,\n \"acc_stderr\": 0.027716661650194045,\n\
\ \"acc_norm\": 0.4567901234567901,\n \"acc_norm_stderr\": 0.027716661650194045\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650144,\n \
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650144\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3155149934810952,\n\
\ \"acc_stderr\": 0.011869184843058642,\n \"acc_norm\": 0.3155149934810952,\n\
\ \"acc_norm_stderr\": 0.011869184843058642\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.030008562845003483,\n\
\ \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.030008562845003483\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4068627450980392,\n \"acc_stderr\": 0.019873802005061177,\n \
\ \"acc_norm\": 0.4068627450980392,\n \"acc_norm_stderr\": 0.019873802005061177\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.41818181818181815,\n\
\ \"acc_stderr\": 0.047245774057315705,\n \"acc_norm\": 0.41818181818181815,\n\
\ \"acc_norm_stderr\": 0.047245774057315705\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4530612244897959,\n \"acc_stderr\": 0.03186785930004129,\n\
\ \"acc_norm\": 0.4530612244897959,\n \"acc_norm_stderr\": 0.03186785930004129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5223880597014925,\n\
\ \"acc_stderr\": 0.035319879302087305,\n \"acc_norm\": 0.5223880597014925,\n\
\ \"acc_norm_stderr\": 0.035319879302087305\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n\
\ \"acc_stderr\": 0.03740059382029321,\n \"acc_norm\": 0.3614457831325301,\n\
\ \"acc_norm_stderr\": 0.03740059382029321\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6023391812865497,\n \"acc_stderr\": 0.03753638955761691,\n\
\ \"acc_norm\": 0.6023391812865497,\n \"acc_norm_stderr\": 0.03753638955761691\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842899,\n \"mc2\": 0.41500345275114237,\n\
\ \"mc2_stderr\": 0.014673369121203574\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7190213101815311,\n \"acc_stderr\": 0.012632541095875825\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148674337\n }\n}\n```"
repo_url: https://huggingface.co/damerajee/Gaja-v2.00
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|arc:challenge|25_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|gsm8k|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hellaswag|10_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-06T22-55-47.087661.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-06T22-55-47.087661.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- '**/details_harness|winogrande|5_2024-03-06T22-55-47.087661.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-06T22-55-47.087661.parquet'
- config_name: results
data_files:
- split: 2024_03_06T22_55_47.087661
path:
- results_2024-03-06T22-55-47.087661.parquet
- split: latest
path:
- results_2024-03-06T22-55-47.087661.parquet
---
# Dataset Card for Evaluation run of damerajee/Gaja-v2.00
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [damerajee/Gaja-v2.00](https://huggingface.co/damerajee/Gaja-v2.00) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_damerajee__Gaja-v2.00",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-06T22:55:47.087661](https://huggingface.co/datasets/open-llm-leaderboard/details_damerajee__Gaja-v2.00/blob/main/results_2024-03-06T22-55-47.087661.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4089443174753084,
"acc_stderr": 0.034165114593211754,
"acc_norm": 0.4147160920686785,
"acc_norm_stderr": 0.035075942143371355,
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842899,
"mc2": 0.41500345275114237,
"mc2_stderr": 0.014673369121203574
},
"harness|arc:challenge|25": {
"acc": 0.4658703071672355,
"acc_stderr": 0.014577311315231106,
"acc_norm": 0.5179180887372014,
"acc_norm_stderr": 0.014602005585490976
},
"harness|hellaswag|10": {
"acc": 0.5660227046405099,
"acc_stderr": 0.004946089230153019,
"acc_norm": 0.7579167496514638,
"acc_norm_stderr": 0.0042746901436291505
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4037735849056604,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.4037735849056604,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179962,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179962
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.030472973363380045,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.030472973363380045
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.20175438596491227,
"acc_stderr": 0.03775205013583638,
"acc_norm": 0.20175438596491227,
"acc_norm_stderr": 0.03775205013583638
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.038932596106046734,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.038932596106046734
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.45483870967741935,
"acc_stderr": 0.028327743091561063,
"acc_norm": 0.45483870967741935,
"acc_norm_stderr": 0.028327743091561063
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694436,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694436
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.038835659779569286,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.038835659779569286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.41919191919191917,
"acc_stderr": 0.035155207286704175,
"acc_norm": 0.41919191919191917,
"acc_norm_stderr": 0.035155207286704175
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.538860103626943,
"acc_stderr": 0.03597524411734577,
"acc_norm": 0.538860103626943,
"acc_norm_stderr": 0.03597524411734577
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.37948717948717947,
"acc_stderr": 0.024603626924097413,
"acc_norm": 0.37948717948717947,
"acc_norm_stderr": 0.024603626924097413
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230186,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230186
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3739495798319328,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.3739495798319328,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.47522935779816516,
"acc_stderr": 0.021410999753635918,
"acc_norm": 0.47522935779816516,
"acc_norm_stderr": 0.021410999753635918
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686185,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.03191923445686185
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5049019607843137,
"acc_stderr": 0.035091433756067845,
"acc_norm": 0.5049019607843137,
"acc_norm_stderr": 0.035091433756067845
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5611814345991561,
"acc_stderr": 0.032302649315470375,
"acc_norm": 0.5611814345991561,
"acc_norm_stderr": 0.032302649315470375
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.48878923766816146,
"acc_stderr": 0.033549366530984746,
"acc_norm": 0.48878923766816146,
"acc_norm_stderr": 0.033549366530984746
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4351145038167939,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.4351145038167939,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.04792898170907062,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.04792898170907062
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.32515337423312884,
"acc_stderr": 0.036803503712864595,
"acc_norm": 0.32515337423312884,
"acc_norm_stderr": 0.036803503712864595
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.048026946982589726,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.048026946982589726
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5341880341880342,
"acc_stderr": 0.03267942734081228,
"acc_norm": 0.5341880341880342,
"acc_norm_stderr": 0.03267942734081228
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.017867695938429778,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.017867695938429778
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.45664739884393063,
"acc_stderr": 0.026817718130348916,
"acc_norm": 0.45664739884393063,
"acc_norm_stderr": 0.026817718130348916
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2558659217877095,
"acc_stderr": 0.014593620923210735,
"acc_norm": 0.2558659217877095,
"acc_norm_stderr": 0.014593620923210735
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.434640522875817,
"acc_stderr": 0.028384256704883037,
"acc_norm": 0.434640522875817,
"acc_norm_stderr": 0.028384256704883037
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.49517684887459806,
"acc_stderr": 0.02839677044411129,
"acc_norm": 0.49517684887459806,
"acc_norm_stderr": 0.02839677044411129
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4567901234567901,
"acc_stderr": 0.027716661650194045,
"acc_norm": 0.4567901234567901,
"acc_norm_stderr": 0.027716661650194045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.028267657482650144,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.028267657482650144
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3155149934810952,
"acc_stderr": 0.011869184843058642,
"acc_norm": 0.3155149934810952,
"acc_norm_stderr": 0.011869184843058642
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.030008562845003483,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.030008562845003483
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4068627450980392,
"acc_stderr": 0.019873802005061177,
"acc_norm": 0.4068627450980392,
"acc_norm_stderr": 0.019873802005061177
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.047245774057315705,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.047245774057315705
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4530612244897959,
"acc_stderr": 0.03186785930004129,
"acc_norm": 0.4530612244897959,
"acc_norm_stderr": 0.03186785930004129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5223880597014925,
"acc_stderr": 0.035319879302087305,
"acc_norm": 0.5223880597014925,
"acc_norm_stderr": 0.035319879302087305
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3614457831325301,
"acc_stderr": 0.03740059382029321,
"acc_norm": 0.3614457831325301,
"acc_norm_stderr": 0.03740059382029321
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6023391812865497,
"acc_stderr": 0.03753638955761691,
"acc_norm": 0.6023391812865497,
"acc_norm_stderr": 0.03753638955761691
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842899,
"mc2": 0.41500345275114237,
"mc2_stderr": 0.014673369121203574
},
"harness|winogrande|5": {
"acc": 0.7190213101815311,
"acc_stderr": 0.012632541095875825
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674337
}
}
```
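As an illustration, the per-subtask entries in the results block above (each with an `acc` and `acc_stderr` field) can be averaged to get an overall MMLU-style score. The helper below is a sketch, not part of the evaluation harness; the key prefix `harness|hendrycksTest-` matches the entries shown above, and the `sample` dict is a small made-up excerpt in the same shape.

```python
# Hypothetical helper (not part of this card): average the "acc" field
# across all hendrycksTest subtask entries in a harness results dict.
def mean_accuracy(results: dict) -> float:
    accs = [v["acc"] for k, v in results.items()
            if k.startswith("harness|hendrycksTest-")]
    return sum(accs) / len(accs)

# Tiny made-up excerpt in the same shape as the JSON above.
sample = {
    "harness|hendrycksTest-econometrics|5": {"acc": 0.2, "acc_stderr": 0.03},
    "harness|hendrycksTest-virology|5": {"acc": 0.4, "acc_stderr": 0.04},
    "harness|gsm8k|5": {"acc": 0.002, "acc_stderr": 0.001},  # skipped: not an MMLU subtask
}
print(mean_accuracy(sample))  # mean of 0.2 and 0.4 only
```

Note that a proper aggregate would weight each subtask by its question count; the unweighted mean here is only a quick summary.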
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
anandhuvasudev/southindiandish | ---
license: apache-2.0
dataset_info:
features:
- name: Input
dtype: string
- name: Output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2718343
num_examples: 1000
download_size: 1149776
dataset_size: 2718343
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_1.4b_bo16_2_64_mix_50_kl_0.1_prm_160m_thr_0.1_seed_3 | ---
dataset_info:
config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_10
num_bytes: 43401162
num_examples: 18928
- name: epoch_11
num_bytes: 43399191
num_examples: 18928
- name: epoch_12
num_bytes: 43400099
num_examples: 18928
- name: epoch_13
num_bytes: 43399481
num_examples: 18928
- name: epoch_14
num_bytes: 43399301
num_examples: 18928
- name: epoch_15
num_bytes: 43400522
num_examples: 18928
- name: epoch_16
num_bytes: 43400534
num_examples: 18928
- name: epoch_17
num_bytes: 43402147
num_examples: 18928
- name: epoch_18
num_bytes: 43400876
num_examples: 18928
- name: epoch_19
num_bytes: 43401349
num_examples: 18928
- name: epoch_20
num_bytes: 43400325
num_examples: 18928
- name: epoch_21
num_bytes: 43401425
num_examples: 18928
- name: epoch_22
num_bytes: 43400328
num_examples: 18928
- name: epoch_23
num_bytes: 43398628
num_examples: 18928
- name: epoch_24
num_bytes: 43398523
num_examples: 18928
- name: epoch_25
num_bytes: 43399495
num_examples: 18928
- name: epoch_26
num_bytes: 43400842
num_examples: 18928
- name: epoch_27
num_bytes: 43398747
num_examples: 18928
- name: epoch_28
num_bytes: 43401571
num_examples: 18928
- name: epoch_29
num_bytes: 43400288
num_examples: 18928
- name: epoch_0
num_bytes: 43742688
num_examples: 18928
- name: epoch_1
num_bytes: 43771383
num_examples: 18928
- name: epoch_2
num_bytes: 43617215
num_examples: 18928
- name: epoch_3
num_bytes: 43517649
num_examples: 18928
- name: epoch_4
num_bytes: 43473677
num_examples: 18928
- name: epoch_5
num_bytes: 43452996
num_examples: 18928
- name: epoch_6
num_bytes: 43444718
num_examples: 18928
- name: epoch_7
num_bytes: 43439699
num_examples: 18928
- name: epoch_8
num_bytes: 43434978
num_examples: 18928
- name: epoch_9
num_bytes: 43435447
num_examples: 18928
download_size: 924706105
dataset_size: 1303335284
configs:
- config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
data_files:
- split: epoch_0
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_29-*
---
|
lauralex/dbdicons | ---
license: mit
---
|
asun17904/imdb-test | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
splits:
- name: test
num_bytes: 19590411.0
num_examples: 15000
download_size: 12828803
dataset_size: 19590411.0
---
# Dataset Card for "imdb-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
brian-lim/smile_style_orca | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 34830270
num_examples: 31526
download_size: 2812219
dataset_size: 34830270
---
# Dataset Card for "smile_style_orca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1712810567 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 21850
num_examples: 47
download_size: 11696
dataset_size: 21850
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712810567"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lexaizero/itulahpokoknya | ---
license: mit
---
|