| datasetId | card |
|---|---|
bassie96code/train_wettekst | ---
dataset_info:
  features:
  - name: tok_wettekst
    sequence: string
  - name: aantal tokens
    dtype: int64
  - name: label lijsten
    sequence: int64
  - name: __index_level_0__
    dtype: int64
  splits:
  - name: train
    num_bytes: 32598
    num_examples: 80
  download_size: 10866
  dataset_size: 32598
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
# Dataset Card for "train_wettekst"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tech9/sissy-image-dataset1 | ---
license: wtfpl
---
|
harpreetsahota/diverse-token-sampler | ---
dataset_info:
  features:
  - name: prompt
    dtype: string
  - name: type
    dtype: string
  splits:
  - name: train
    num_bytes: 7838
    num_examples: 68
  download_size: 7314
  dataset_size: 7838
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
license: mit
pretty_name: Diverse Token Sampler
---
# 🌈 Diverse Token Sampler Dataset 🌟
## Overview 📜
Welcome to the `DiverseTokenSampler` dataset! 🚀 This collection is designed to test the boundaries of LLMs, in particular their versatility and robustness. 🤖 It spans a broad spectrum of prompts, from conventional linguistic constructs to mixed-language scripts, emojis, 🎉 technical code snippets, and nonsensical strings. It is a useful resource for researchers and developers 🧑💻 who want to probe the depths and limitations of their NLP models with diverse and complex inputs.
## Contents 📚
`DiverseTokenSampler` includes an eclectic mix of prompt types:
- **📖 Narrative Beginnings**: Unleash creativity in storytelling.
- **🌄 Descriptive Texts**: Paint vivid pictures with words.
- **💬 Dialogue Initiations**: Spark engaging conversations.
- **🔬 Technical and Academic Texts**: Dive into specialized knowledge.
- **🎶 Poetic Openings**: Explore the beauty of lyrical language.
- **💡 Thought-Provoking Statements**: Stimulate reflective thinking.
- **🏛 Historical Contexts**: Travel through time with historical narratives.
- **🌌 Fictional World-building**: Craft realms of imagination.
- **🔍 Mystery Setups**: Invoke intrigue and curiosity.
- **🧩 Mixed Content**: A kaleidoscope of languages, emojis, and code.
- **❓ Non-linguistic**: Challenge models with abstract character assortments.
## Applications 🛠
Use `DiverseTokenSampler` for:
- **🎓 Model Training and Fine-Tuning**: Augment models' linguistic versatility.
- **🔗 Robustness Testing**: Gauge models against unusual and unexpected inputs.
- **⚖️ Bias Detection**: Uncover and address potential biases.
- **🧠 Language Understanding Evaluation**: Assess comprehension across varied prompts.
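As a quick sketch of how the dataset might be used (the helper below is ours, not part of the dataset; loading from the Hub assumes the `datasets` library and network access), you can load the prompts and tally them by `type`:

```python
from collections import Counter

def count_prompt_types(rows):
    """Tally how many prompts fall under each `type` value."""
    return Counter(row["type"] for row in rows)

def load_sampler(split="train"):
    """Fetch the dataset from the Hub (requires network access)."""
    from datasets import load_dataset  # pip install datasets
    return load_dataset("harpreetsahota/diverse-token-sampler", split=split)

# Example with inline rows; swap in load_sampler() for the real data:
rows = [{"prompt": "Once upon a time", "type": "narrative"},
        {"prompt": "01010 🎉 var x;", "type": "mixed"},
        {"prompt": "Tell me a story", "type": "narrative"}]
print(count_prompt_types(rows))  # Counter({'narrative': 2, 'mixed': 1})
```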
## Contribution 🤝
We are eager for your ideas and improvements! 🌟 If you have novel prompts or enhancements, feel free to submit a pull request or open an issue.
## License 📄
This dataset is open-sourced under the [MIT License](LICENSE.md).
|
voyagar/cloud_matrix_summary | ---
license: unknown
---
|
andersonbcdefg/red_teaming_reward_modeling_pairwise | ---
dataset_info:
  features:
  - name: prompt
    dtype: string
  - name: response_a
    dtype: string
  - name: response_b
    dtype: string
  - name: explanation
    dtype: string
  - name: preferred
    dtype: string
  - name: __index_level_0__
    dtype: int64
  splits:
  - name: train
    num_bytes: 41305999
    num_examples: 35279
  download_size: 0
  dataset_size: 41305999
---
# Dataset Card for "red_teaming_reward_modeling_pairwise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KomeijiForce/ARC-Easy-Explained-by-ChatGPT | ---
task_categories:
- question-answering
language:
- en
size_categories:
- 1K<n<10K
---
This dataset contains explanations from ChatGPT for the correct and incorrect answers in ARC-Easy. The explanations were generated by prompting ChatGPT with the answer keys and in-context examples. We expect this dataset to be a useful resource for studying the commonsense reasoning ability of LLMs or for training other LMs. |
Jour/Translation | ---
task_categories:
- translation
size_categories:
- 100K<n<1M
---
A dataset for translation. |
SLKpnu/sequential | ---
license: mit
---
|
CyberHarem/vira_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of vira/ヴィーラ (Granblue Fantasy)
This is the dataset of vira/ヴィーラ (Granblue Fantasy), containing 27 images and their tags.
The core tags of this character are `blonde_hair, long_hair, red_eyes, bow, hair_bow, ponytail, breasts, bangs, hair_between_eyes, hair_ornament, black_bow`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan), and the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 27 | 27.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vira_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 27 | 20.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vira_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 53 | 37.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vira_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 27 | 26.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vira_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 53 | 45.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vira_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
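Any package in the table above can be fetched directly with `hf_hub_download`; the archive names in this repo follow the `dataset-<name>.zip` pattern. A minimal sketch (the `package_filename` helper is ours, for illustration only):

```python
def package_filename(name):
    """Map a package name from the table (e.g. 'raw', '800',
    'stage3-p480-800') to its archive filename in this repo."""
    return f"dataset-{name}.zip"

def download_package(name):
    """Fetch one package archive from the Hub (requires network access)."""
    from huggingface_hub import hf_hub_download
    return hf_hub_download(
        repo_id="CyberHarem/vira_granbluefantasy",
        repo_type="dataset",
        filename=package_filename(name),
    )

print(package_filename("800"))  # dataset-800.zip
```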
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/vira_granbluefantasy',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Tag clustering results are listed below; some recurring outfits may be discoverable from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, hair_flower, looking_at_viewer, obi, open_mouth, solo, blush, floral_print, red_kimono, :d, sidelocks, wide_sleeves, hamaya, holding, long_sleeves, official_alternate_costume, upper_body |
| 1 | 13 |  |  |  |  |  | 1girl, solo, armor, looking_at_viewer, smile, sword, cleavage, dress, holding_weapon, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hair_flower | looking_at_viewer | obi | open_mouth | solo | blush | floral_print | red_kimono | :d | sidelocks | wide_sleeves | hamaya | holding | long_sleeves | official_alternate_costume | upper_body | armor | smile | sword | cleavage | dress | holding_weapon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------------------|:------|:-------------|:-------|:--------|:---------------|:-------------|:-----|:------------|:---------------|:---------|:----------|:---------------|:-----------------------------|:-------------|:--------|:--------|:--------|:-----------|:--------|:-----------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 13 |  |  |  |  |  | X | | X | | | X | | | | | | | | | | | X | X | X | X | X | X | X |
|
bdsaglam/webnlg-jerx-sft-openai | ---
dataset_info:
  features:
  - name: chat
    list:
    - name: content
      dtype: string
    - name: role
      dtype: string
  splits:
  - name: train
    num_bytes: 17422745
    num_examples: 35426
  - name: dev
    num_bytes: 2199484
    num_examples: 4464
  - name: test
    num_bytes: 3840482
    num_examples: 7305
  download_size: 2699070
  dataset_size: 23462711
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: dev
    path: data/dev-*
  - split: test
    path: data/test-*
---
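The card has no description, but the schema above declares a single `chat` feature: a list of `{content, role}` messages. A minimal sketch for rendering one example as a plain-text transcript (field names are taken from the metadata; the message contents below are illustrative, and the loader requires the `datasets` library plus network access):

```python
def render_chat(chat):
    """Join a list of {role, content} messages into a readable transcript."""
    return "\n".join(f"{msg['role']}: {msg['content']}" for msg in chat)

def load_split(split="train"):
    """Fetch one split of this dataset from the Hub."""
    from datasets import load_dataset  # pip install datasets
    return load_dataset("bdsaglam/webnlg-jerx-sft-openai", split=split)

# Illustrative example in the declared schema:
example = [{"role": "user", "content": "Extract the triples."},
           {"role": "assistant", "content": "(Alan_Shepard, birthPlace, Derry)"}]
print(render_chat(example))
```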
|
MedAliFarhat/medication_description | ---
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 48109
    num_examples: 100
  download_size: 32385
  dataset_size: 48109
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
|
gg-ai/dataset-072123 | ---
dataset_info:
  features:
  - name: text
    dtype: string
  - name: sent
    dtype: int64
  - name: text_0
    dtype: string
  - name: text_1
    dtype: string
  - name: text_2
    dtype: string
  - name: __index_level_0__
    dtype: int64
  splits:
  - name: train
    num_bytes: 460653.36842105264
    num_examples: 613
  - name: test
    num_bytes: 81910.63157894737
    num_examples: 109
  download_size: 326967
  dataset_size: 542564.0
---
# Dataset Card for "dataset-072123"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Mihaiii__Metis-0.5 | ---
pretty_name: Evaluation run of Mihaiii/Metis-0.5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Mihaiii/Metis-0.5](https://huggingface.co/Mihaiii/Metis-0.5) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mihaiii__Metis-0.5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T21:03:34.268283](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Metis-0.5/blob/main/results_2023-12-29T21-03-34.268283.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6205495686116762,\n\
\ \"acc_stderr\": 0.03278601697551399,\n \"acc_norm\": 0.6253153124790392,\n\
\ \"acc_norm_stderr\": 0.033438294991220995,\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.4932590097591299,\n\
\ \"mc2_stderr\": 0.015440588307546098\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522082,\n\
\ \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759093\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6546504680342561,\n\
\ \"acc_stderr\": 0.004745103543901293,\n \"acc_norm\": 0.8376817367058355,\n\
\ \"acc_norm_stderr\": 0.003679889125399814\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894444,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894444\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n\
\ \"acc_stderr\": 0.02499305339776483,\n \"acc_norm\": 0.7387096774193549,\n\
\ \"acc_norm_stderr\": 0.02499305339776483\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.01703071933915434,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.01703071933915434\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229153,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229153\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531772,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531772\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\
\ \"acc_stderr\": 0.02537213967172293,\n \"acc_norm\": 0.8162393162393162,\n\
\ \"acc_norm_stderr\": 0.02537213967172293\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407003,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407003\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.02475241196091721,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.02475241196091721\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n\
\ \"acc_stderr\": 0.01649540063582008,\n \"acc_norm\": 0.41787709497206704,\n\
\ \"acc_norm_stderr\": 0.01649540063582008\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824087,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824087\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.438722294654498,\n\
\ \"acc_stderr\": 0.012673969883493272,\n \"acc_norm\": 0.438722294654498,\n\
\ \"acc_norm_stderr\": 0.012673969883493272\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.02916312857067073,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.02916312857067073\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6372549019607843,\n \"acc_stderr\": 0.019450768432505514,\n \
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.019450768432505514\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249765,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249765\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338733,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338733\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.4932590097591299,\n\
\ \"mc2_stderr\": 0.015440588307546098\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7513812154696132,\n \"acc_stderr\": 0.012147314713403107\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4291129643669447,\n \
\ \"acc_stderr\": 0.013633369425647244\n }\n}\n```"
repo_url: https://huggingface.co/Mihaiii/Metis-0.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
  data_files:
  - split: 2023_12_29T21_03_34.268283
    path:
    - '**/details_harness|arc:challenge|25_2023-12-29T21-03-34.268283.parquet'
  - split: latest
    path:
    - '**/details_harness|arc:challenge|25_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_gsm8k_5
  data_files:
  - split: 2023_12_29T21_03_34.268283
    path:
    - '**/details_harness|gsm8k|5_2023-12-29T21-03-34.268283.parquet'
  - split: latest
    path:
    - '**/details_harness|gsm8k|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hellaswag_10
  data_files:
  - split: 2023_12_29T21_03_34.268283
    path:
    - '**/details_harness|hellaswag|10_2023-12-29T21-03-34.268283.parquet'
  - split: latest
    path:
    - '**/details_harness|hellaswag|10_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_5
  data_files:
  - split: 2023_12_29T21_03_34.268283
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-management|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-03-34.268283.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-03-34.268283.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T21-03-34.268283.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T21-03-34.268283.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- '**/details_harness|winogrande|5_2023-12-29T21-03-34.268283.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T21-03-34.268283.parquet'
- config_name: results
data_files:
- split: 2023_12_29T21_03_34.268283
path:
- results_2023-12-29T21-03-34.268283.parquet
- split: latest
path:
- results_2023-12-29T21-03-34.268283.parquet
---
# Dataset Card for Evaluation run of Mihaiii/Metis-0.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Mihaiii/Metis-0.5](https://huggingface.co/Mihaiii/Metis-0.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mihaiii__Metis-0.5",
"harness_winogrande_5",
split="train")
```
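The aggregate `"all"` block in the results below plausibly averages the per-task scores. A minimal sketch of that aggregation, using a few illustrative task entries copied from the JSON in this card (not the full set of 57 MMLU tasks, so the resulting mean will not match the `"all"` value):

```python
# Sketch: averaging per-task accuracies, as the "all" entry presumably does.
# Task names and values are copied from the latest-results JSON in this card;
# the complete data lives in the "results" config of this dataset.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6710526315789473},
}

# Mean accuracy over the selected tasks.
mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 4))  # 0.5212
```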
## Latest results
These are the [latest results from run 2023-12-29T21:03:34.268283](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Metis-0.5/blob/main/results_2023-12-29T21-03-34.268283.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6205495686116762,
"acc_stderr": 0.03278601697551399,
"acc_norm": 0.6253153124790392,
"acc_norm_stderr": 0.033438294991220995,
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.4932590097591299,
"mc2_stderr": 0.015440588307546098
},
"harness|arc:challenge|25": {
"acc": 0.5887372013651877,
"acc_stderr": 0.014379441068522082,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759093
},
"harness|hellaswag|10": {
"acc": 0.6546504680342561,
"acc_stderr": 0.004745103543901293,
"acc_norm": 0.8376817367058355,
"acc_norm_stderr": 0.003679889125399814
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.02499305339776483,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.02499305339776483
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.01703071933915434,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.01703071933915434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229153,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229153
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531772,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531772
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.02537213967172293,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.02537213967172293
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407003,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407003
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.02475241196091721,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.02475241196091721
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41787709497206704,
"acc_stderr": 0.01649540063582008,
"acc_norm": 0.41787709497206704,
"acc_norm_stderr": 0.01649540063582008
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824087,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824087
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.438722294654498,
"acc_stderr": 0.012673969883493272,
"acc_norm": 0.438722294654498,
"acc_norm_stderr": 0.012673969883493272
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.02916312857067073,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.02916312857067073
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.019450768432505514,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.019450768432505514
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249765,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249765
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338733,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338733
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.4932590097591299,
"mc2_stderr": 0.015440588307546098
},
"harness|winogrande|5": {
"acc": 0.7513812154696132,
"acc_stderr": 0.012147314713403107
},
"harness|gsm8k|5": {
"acc": 0.4291129643669447,
"acc_stderr": 0.013633369425647244
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
psiyou/ambient_noise_dataset | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 12264640812.875
num_examples: 5575
download_size: 11869076631
dataset_size: 12264640812.875
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MortenTabaka/LandCover-Aerial-Imagery-for-semantic-segmentation | ---
license: cc-by-nc-sa-4.0
task_categories:
- image-segmentation
---
# LandCover.ai: Dataset for Automatic Mapping of Buildings, Woodlands, Water and Roads from Aerial Imagery
My project based on this dataset can be found on GitHub: https://github.com/MortenTabaka/Semantic-segmentation-of-LandCover.ai-dataset
The dataset used in this project is the [Landcover.ai Dataset](https://landcover.ai.linuxpolska.com/),
which was originally published with the paper [LandCover.ai: Dataset for Automatic Mapping of Buildings, Woodlands, Water and Roads from Aerial Imagery](https://arxiv.org/abs/2005.02264),
also accessible on [PapersWithCode](https://paperswithcode.com/paper/landcover-ai-dataset-for-automatic-mapping-of).
**Please note that I am not the author or owner of this dataset, and I am using it under the terms of the license specified by the original author.
All credits for the dataset go to the original author and contributors.**
|
mponty/code_champs_solutions | ---
dataset_info:
features:
- name: submission_id
dtype: string
- name: problem_id
dtype: string
- name: date
dtype: int64
- name: language
dtype: string
- name: verdict
dtype: string
- name: cpu_time
dtype: int64
- name: memory
dtype: int64
- name: code
dtype: string
- name: source
dtype: string
- name: testcount
dtype: int64
- name: lenght
dtype: int64
splits:
- name: train
num_bytes: 48699691541
num_examples: 34994861
download_size: 18591747965
dataset_size: 48699691541
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_champs_solutions"
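Given the ~18 GB download size listed above, filtering is the practical way to work with this dataset. A minimal sketch of a row predicate over the schema above — note the exact `"OK"` verdict and `"Python"` language strings are assumptions about the raw values, which this card does not document:

```python
def is_accepted_python(row: dict) -> bool:
    # Keep only accepted Python submissions; field names come from the
    # schema above, but the literal verdict/language values are guesses.
    return row["verdict"] == "OK" and row["language"].startswith("Python")

# With the `datasets` library, streaming avoids the full up-front download:
#   from datasets import load_dataset
#   ds = load_dataset("mponty/code_champs_solutions", split="train", streaming=True)
#   sample = [r for r in ds.take(1000) if is_accepted_python(r)]

# Offline demonstration on rows shaped like the schema:
rows = [
    {"verdict": "OK", "language": "Python 3", "problem_id": "p1"},
    {"verdict": "WRONG_ANSWER", "language": "Python 3", "problem_id": "p2"},
    {"verdict": "OK", "language": "C++17", "problem_id": "p3"},
]
kept = [r["problem_id"] for r in rows if is_accepted_python(r)]
print(kept)  # ['p1']
```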
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_70 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1022119912
num_examples: 199166
download_size: 1043945674
dataset_size: 1022119912
---
# Dataset Card for "chunk_70"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yeniceriSGK/TinyLlamaDatasetSample1 | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 19997
num_examples: 10
download_size: 21157
dataset_size: 19997
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-high_school_world_history-dev | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: dev
num_bytes: 9749
num_examples: 5
download_size: 0
dataset_size: 9749
---
# Dataset Card for "mmlu-high_school_world_history-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/b7d6780d | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1332
dataset_size: 182
---
# Dataset Card for "b7d6780d"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ShoejustCR/cosmetics_knowledge | ---
license: llama2
---
https://huggingface.co/datasets/ShoejustCR/cosmetics_knowledge
|
GrantC/tinierstories | ---
license: apache-2.0
dataset_info:
features:
- name: prompt
dtype: string
- name: story
dtype: string
splits:
- name: train
num_bytes: 699295
num_examples: 550
download_size: 257397
dataset_size: 699295
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_allknowingroger__Neurallaymons-7B-slerp | ---
pretty_name: Evaluation run of allknowingroger/Neurallaymons-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [allknowingroger/Neurallaymons-7B-slerp](https://huggingface.co/allknowingroger/Neurallaymons-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_allknowingroger__Neurallaymons-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-10T21:49:05.731032](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__Neurallaymons-7B-slerp/blob/main/results_2024-04-10T21-49-05.731032.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6604574479964851,\n\
\ \"acc_stderr\": 0.0318115287225691,\n \"acc_norm\": 0.6606315264458453,\n\
\ \"acc_norm_stderr\": 0.03246687979365843,\n \"mc1\": 0.47123623011015914,\n\
\ \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.6444622727555894,\n\
\ \"mc2_stderr\": 0.014905695944552787\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6655290102389079,\n \"acc_stderr\": 0.013787460322441372,\n\
\ \"acc_norm\": 0.6996587030716723,\n \"acc_norm_stderr\": 0.013395909309957009\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6814379605656243,\n\
\ \"acc_stderr\": 0.004649665273890646,\n \"acc_norm\": 0.8685520812587134,\n\
\ \"acc_norm_stderr\": 0.0033719902188524588\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.02302589961718872,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.02302589961718872\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291936,\n\
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291936\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700481,\n \"\
acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700481\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8725490196078431,\n \"acc_stderr\": 0.02340553048084631,\n \"\
acc_norm\": 0.8725490196078431,\n \"acc_norm_stderr\": 0.02340553048084631\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8354430379746836,\n \"acc_stderr\": 0.024135736240566932,\n \
\ \"acc_norm\": 0.8354430379746836,\n \"acc_norm_stderr\": 0.024135736240566932\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n\
\ \"acc_stderr\": 0.013223928616741619,\n \"acc_norm\": 0.8365261813537676,\n\
\ \"acc_norm_stderr\": 0.013223928616741619\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992005,\n\
\ \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992005\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3843575418994413,\n\
\ \"acc_stderr\": 0.0162690886639594,\n \"acc_norm\": 0.3843575418994413,\n\
\ \"acc_norm_stderr\": 0.0162690886639594\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7331189710610932,\n\
\ \"acc_stderr\": 0.025122637608816657,\n \"acc_norm\": 0.7331189710610932,\n\
\ \"acc_norm_stderr\": 0.025122637608816657\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713002,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713002\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n\
\ \"acc_stderr\": 0.012745204626083138,\n \"acc_norm\": 0.46870925684485004,\n\
\ \"acc_norm_stderr\": 0.012745204626083138\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02767846864214472,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02767846864214472\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47123623011015914,\n\
\ \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.6444622727555894,\n\
\ \"mc2_stderr\": 0.014905695944552787\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8184688239936859,\n \"acc_stderr\": 0.010833276515007482\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7134192570128886,\n \
\ \"acc_stderr\": 0.012454841668337687\n }\n}\n```"
repo_url: https://huggingface.co/allknowingroger/Neurallaymons-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|arc:challenge|25_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|gsm8k|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hellaswag|10_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T21-49-05.731032.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T21-49-05.731032.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- '**/details_harness|winogrande|5_2024-04-10T21-49-05.731032.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-10T21-49-05.731032.parquet'
- config_name: results
data_files:
- split: 2024_04_10T21_49_05.731032
path:
- results_2024-04-10T21-49-05.731032.parquet
- split: latest
path:
- results_2024-04-10T21-49-05.731032.parquet
---
# Dataset Card for Evaluation run of allknowingroger/Neurallaymons-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [allknowingroger/Neurallaymons-7B-slerp](https://huggingface.co/allknowingroger/Neurallaymons-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_allknowingroger__Neurallaymons-7B-slerp",
"harness_winogrande_5",
	split="latest")
```
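Per-task MMLU details follow a predictable config-name pattern, as the config list above shows (e.g. `harness_hendrycksTest_anatomy_5` for the 5-shot "anatomy" task). A small illustrative helper (the naming scheme is inferred from this card's own config list, not an official API) can build these names:

```python
def mmlu_config_name(task: str, n_shot: int = 5) -> str:
    """Build the per-task config name used in this details repo,
    e.g. mmlu_config_name("anatomy") -> "harness_hendrycksTest_anatomy_5".
    Pattern inferred from the config list in this card."""
    return f"harness_hendrycksTest_{task}_{n_shot}"

# Usage sketch (requires network access):
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_allknowingroger__Neurallaymons-7B-slerp",
#     mmlu_config_name("anatomy"),
#     split="latest",
# )
print(mmlu_config_name("anatomy"))  # harness_hendrycksTest_anatomy_5
```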
## Latest results
These are the [latest results from run 2024-04-10T21:49:05.731032](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__Neurallaymons-7B-slerp/blob/main/results_2024-04-10T21-49-05.731032.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6604574479964851,
"acc_stderr": 0.0318115287225691,
"acc_norm": 0.6606315264458453,
"acc_norm_stderr": 0.03246687979365843,
"mc1": 0.47123623011015914,
"mc1_stderr": 0.017474513848525518,
"mc2": 0.6444622727555894,
"mc2_stderr": 0.014905695944552787
},
"harness|arc:challenge|25": {
"acc": 0.6655290102389079,
"acc_stderr": 0.013787460322441372,
"acc_norm": 0.6996587030716723,
"acc_norm_stderr": 0.013395909309957009
},
"harness|hellaswag|10": {
"acc": 0.6814379605656243,
"acc_stderr": 0.004649665273890646,
"acc_norm": 0.8685520812587134,
"acc_norm_stderr": 0.0033719902188524588
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291936,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291936
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700481,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700481
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8725490196078431,
"acc_stderr": 0.02340553048084631,
"acc_norm": 0.8725490196078431,
"acc_norm_stderr": 0.02340553048084631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8354430379746836,
"acc_stderr": 0.024135736240566932,
"acc_norm": 0.8354430379746836,
"acc_norm_stderr": 0.024135736240566932
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741619,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741619
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992005,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992005
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3843575418994413,
"acc_stderr": 0.0162690886639594,
"acc_norm": 0.3843575418994413,
"acc_norm_stderr": 0.0162690886639594
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7331189710610932,
"acc_stderr": 0.025122637608816657,
"acc_norm": 0.7331189710610932,
"acc_norm_stderr": 0.025122637608816657
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713002,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713002
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083138,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083138
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02767846864214472,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02767846864214472
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47123623011015914,
"mc1_stderr": 0.017474513848525518,
"mc2": 0.6444622727555894,
"mc2_stderr": 0.014905695944552787
},
"harness|winogrande|5": {
"acc": 0.8184688239936859,
"acc_stderr": 0.010833276515007482
},
"harness|gsm8k|5": {
"acc": 0.7134192570128886,
"acc_stderr": 0.012454841668337687
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
PritiLohra/orca_paragraphs | ---
license: mit
---
|
AyoubChLin/CNN_News_Articles_clean | ---
license: apache-2.0
---
|
biodatlab/whisper-th-custom | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 24293230034.95
num_examples: 601854
download_size: 35844557183
dataset_size: 24293230034.95
---
# Dataset Card for "whisper-th-custom"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NickyNicky/aya_dataset_multilingual_chatml_gemma | ---
dataset_info:
features:
- name: text
dtype: string
- name: len_tokens
dtype: int64
splits:
- name: train
num_bytes: 105948864
num_examples: 134977
download_size: 22162959
dataset_size: 105948864
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
datasets:
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext1
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext2
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext3
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext4
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext5
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext6
# - NickyNicky/aya_dataset_multilingual_inputs_targets_ext7
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext8
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext9
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext10
language:
- es
- fr
- en
- de
---
# Tokenizer: google/gemma-2b-it
```
datasets:
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext1
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext2
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext3
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext4
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext5
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext6
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext8
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext9
- NickyNicky/aya_dataset_multilingual_inputs_targets_ext10
```
```
# FORMAT CHATML EXAMPLE
<bos><start_of_turn>system
You are a helpful AI assistant.
lista de codigos linguisticos disponibles: ["fr", "es"].<end_of_turn>
<start_of_turn>user
Donnez-moi un exemple de quiz dans cette catégorie : les livres.<end_of_turn>
<start_of_turn>model
¿Quién escribió : El Señor de los Anillos? <end_of_turn><eos>
```
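The template above can be assembled programmatically. Below is a minimal sketch of a formatter for this turn layout; the `format_gemma_chat` helper and its turn structure are illustrative assumptions, not part of the dataset tooling:

```python
# Minimal sketch of rendering (role, text) turns into the Gemma-style
# chat template shown above. The helper name and exact whitespace around
# the final <eos> are illustrative assumptions.

def format_gemma_chat(turns, bos="<bos>", eos="<eos>"):
    """Render a list of (role, text) pairs into the chat template."""
    parts = [bos]
    for role, text in turns:
        parts.append(f"<start_of_turn>{role}\n{text}<end_of_turn>\n")
    # In the example, <eos> follows the model's closing <end_of_turn>.
    return "".join(parts) + eos

prompt = format_gemma_chat([
    ("system", "You are a helpful AI assistant."),
    ("user", "Donnez-moi un exemple de quiz dans cette catégorie : les livres."),
    ("model", "¿Quién escribió : El Señor de los Anillos?"),
])
print(prompt)
```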
# Histogram of len_tokens

# Summary statistics (describe).

# Percentiles.

|
Felipe474/nilc-coraa-v1 | ---
license: other
---
### CORAA V1 - Dataset
CORAA is a publicly available dataset for Automatic Speech Recognition (ASR) in Brazilian Portuguese, containing 290.77 hours of audio and the corresponding transcriptions (400k+ segmented audio files). The dataset is composed of audio from five original projects:
* ALIP (Gonçalves, 2019)
* C-ORAL Brazil (Raso and Mello, 2012)
* NURC-Recife (Oliviera Jr., 2016)
* SP-2010 (Mendes and Oushiro, 2012)
* TEDx talks (talks in Portuguese)
The audio segments were either validated by annotators or transcribed for the first time for the ASR task.
<br>
### References
* Gonçalves SCL (2019) Projeto ALIP (amostra linguística do interior paulista) e banco de dados iboruna: 10 anos de contribuição com a descrição do Português Brasileiro. Revista Estudos Linguísticos 48(1):276–297.
* Raso T, Mello H, Mittmann MM (2012) The C-ORAL-BRASIL I: Reference corpus for spoken Brazilian Portuguese. In: Proceedings of the Eighth International Conference on Language Resources and Evaluation (LREC’12), European Language Resources Association (ELRA), Istanbul, Turkey, pp 106–113, URL http://www.lrec-conf.org/proceedings/lrec2012/pdf/624_Paper.pdf
* Oliviera Jr M (2016) Nurc digital um protocolo para a digitalização, anotação, arquivamento e disseminação do material do projeto da norma urbana linguística culta (NURC). CHIMERA: Revista de Corpus de Lenguas Romances y Estudios Linguísticos 3(2):149–174, URL https://revistas.uam.es/chimera/article/view/6519
* Mendes RB, Oushiro L (2012) Mapping Paulistano Portuguese: the SP2010 Project. In: Proceedings of the VIIth GSCP International Conference: Speech and Corpora, Fizenze University Press, Firenze, Italy, pp 459–463. |
distilled-from-one-sec-cv12/chunk_145 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1058043912
num_examples: 206166
download_size: 1082319418
dataset_size: 1058043912
---
# Dataset Card for "chunk_145"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/Hatefulmemes_test_google_flan_t5_xxl_mode_T_A_C_OCR_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_10
num_bytes: 8715503
num_examples: 1000
- name: fewshot_0
num_bytes: 821592
num_examples: 1000
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random_text
num_bytes: 1123333
num_examples: 1000
- name: fewshot_0_clip_tags_ViT_L_14_with_openai_Attributes_ViT_L_14_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random_text
num_bytes: 1141686
num_examples: 1000
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__text
num_bytes: 1120437
num_examples: 1000
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full__text
num_bytes: 1298358
num_examples: 1000
download_size: 2339668
dataset_size: 14220909
---
# Dataset Card for "Hatefulmemes_test_google_flan_t5_xxl_mode_T_A_C_OCR_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
susnato/testing1_v1_features | ---
dataset_info:
features:
- name: query_emb_0
dtype: float64
- name: query_emb_1
dtype: float64
- name: query_emb_2
dtype: float64
- name: query_emb_3
dtype: float64
- name: query_emb_4
dtype: float64
- name: query_emb_5
dtype: float64
- name: query_emb_6
dtype: float64
- name: query_emb_7
dtype: float64
- name: query_emb_8
dtype: float64
- name: query_emb_9
dtype: float64
- name: query_emb_10
dtype: float64
- name: query_emb_11
dtype: float64
- name: query_emb_12
dtype: float64
- name: query_emb_13
dtype: float64
- name: query_emb_14
dtype: float64
- name: query_emb_15
dtype: float64
- name: query_emb_16
dtype: float64
- name: query_emb_17
dtype: float64
- name: query_emb_18
dtype: float64
- name: query_emb_19
dtype: float64
- name: query_emb_20
dtype: float64
- name: query_emb_21
dtype: float64
- name: query_emb_22
dtype: float64
- name: query_emb_23
dtype: float64
- name: query_emb_24
dtype: float64
- name: query_emb_25
dtype: float64
- name: query_emb_26
dtype: float64
- name: query_emb_27
dtype: float64
- name: query_emb_28
dtype: float64
- name: query_emb_29
dtype: float64
- name: query_emb_30
dtype: float64
- name: query_emb_31
dtype: float64
- name: query_emb_32
dtype: float64
- name: query_emb_33
dtype: float64
- name: query_emb_34
dtype: float64
- name: query_emb_35
dtype: float64
- name: query_emb_36
dtype: float64
- name: query_emb_37
dtype: float64
- name: query_emb_38
dtype: float64
- name: query_emb_39
dtype: float64
- name: query_emb_40
dtype: float64
- name: query_emb_41
dtype: float64
- name: query_emb_42
dtype: float64
- name: query_emb_43
dtype: float64
- name: query_emb_44
dtype: float64
- name: query_emb_45
dtype: float64
- name: query_emb_46
dtype: float64
- name: query_emb_47
dtype: float64
- name: query_emb_48
dtype: float64
- name: query_emb_49
dtype: float64
- name: query_emb_50
dtype: float64
- name: query_emb_51
dtype: float64
- name: query_emb_52
dtype: float64
- name: query_emb_53
dtype: float64
- name: query_emb_54
dtype: float64
- name: query_emb_55
dtype: float64
- name: query_emb_56
dtype: float64
- name: query_emb_57
dtype: float64
- name: query_emb_58
dtype: float64
- name: query_emb_59
dtype: float64
- name: query_emb_60
dtype: float64
- name: query_emb_61
dtype: float64
- name: query_emb_62
dtype: float64
- name: query_emb_63
dtype: float64
- name: query_emb_64
dtype: float64
- name: query_emb_65
dtype: float64
- name: query_emb_66
dtype: float64
- name: query_emb_67
dtype: float64
- name: query_emb_68
dtype: float64
- name: query_emb_69
dtype: float64
- name: query_emb_70
dtype: float64
- name: query_emb_71
dtype: float64
- name: query_emb_72
dtype: float64
- name: query_emb_73
dtype: float64
- name: query_emb_74
dtype: float64
- name: query_emb_75
dtype: float64
- name: query_emb_76
dtype: float64
- name: query_emb_77
dtype: float64
- name: query_emb_78
dtype: float64
- name: query_emb_79
dtype: float64
- name: query_emb_80
dtype: float64
- name: query_emb_81
dtype: float64
- name: query_emb_82
dtype: float64
- name: query_emb_83
dtype: float64
- name: query_emb_84
dtype: float64
- name: query_emb_85
dtype: float64
- name: query_emb_86
dtype: float64
- name: query_emb_87
dtype: float64
- name: query_emb_88
dtype: float64
- name: query_emb_89
dtype: float64
- name: query_emb_90
dtype: float64
- name: query_emb_91
dtype: float64
- name: query_emb_92
dtype: float64
- name: query_emb_93
dtype: float64
- name: query_emb_94
dtype: float64
- name: query_emb_95
dtype: float64
- name: query_emb_96
dtype: float64
- name: query_emb_97
dtype: float64
- name: query_emb_98
dtype: float64
- name: query_emb_99
dtype: float64
- name: query_emb_100
dtype: float64
- name: query_emb_101
dtype: float64
- name: query_emb_102
dtype: float64
- name: query_emb_103
dtype: float64
- name: query_emb_104
dtype: float64
- name: query_emb_105
dtype: float64
- name: query_emb_106
dtype: float64
- name: query_emb_107
dtype: float64
- name: query_emb_108
dtype: float64
- name: query_emb_109
dtype: float64
- name: query_emb_110
dtype: float64
- name: query_emb_111
dtype: float64
- name: query_emb_112
dtype: float64
- name: query_emb_113
dtype: float64
- name: query_emb_114
dtype: float64
- name: query_emb_115
dtype: float64
- name: query_emb_116
dtype: float64
- name: query_emb_117
dtype: float64
- name: query_emb_118
dtype: float64
- name: query_emb_119
dtype: float64
- name: query_emb_120
dtype: float64
- name: query_emb_121
dtype: float64
- name: query_emb_122
dtype: float64
- name: query_emb_123
dtype: float64
- name: query_emb_124
dtype: float64
- name: query_emb_125
dtype: float64
- name: query_emb_126
dtype: float64
- name: query_emb_127
dtype: float64
- name: query_emb_128
dtype: float64
- name: query_emb_129
dtype: float64
- name: query_emb_130
dtype: float64
- name: query_emb_131
dtype: float64
- name: query_emb_132
dtype: float64
- name: query_emb_133
dtype: float64
- name: query_emb_134
dtype: float64
- name: query_emb_135
dtype: float64
- name: query_emb_136
dtype: float64
- name: query_emb_137
dtype: float64
- name: query_emb_138
dtype: float64
- name: query_emb_139
dtype: float64
- name: query_emb_140
dtype: float64
- name: query_emb_141
dtype: float64
- name: query_emb_142
dtype: float64
- name: query_emb_143
dtype: float64
- name: query_emb_144
dtype: float64
- name: query_emb_145
dtype: float64
- name: query_emb_146
dtype: float64
- name: query_emb_147
dtype: float64
- name: query_emb_148
dtype: float64
- name: query_emb_149
dtype: float64
- name: query_emb_150
dtype: float64
- name: query_emb_151
dtype: float64
- name: query_emb_152
dtype: float64
- name: query_emb_153
dtype: float64
- name: query_emb_154
dtype: float64
- name: query_emb_155
dtype: float64
- name: query_emb_156
dtype: float64
- name: query_emb_157
dtype: float64
- name: query_emb_158
dtype: float64
- name: query_emb_159
dtype: float64
- name: query_emb_160
dtype: float64
- name: query_emb_161
dtype: float64
- name: query_emb_162
dtype: float64
- name: query_emb_163
dtype: float64
- name: query_emb_164
dtype: float64
- name: query_emb_165
dtype: float64
- name: query_emb_166
dtype: float64
- name: query_emb_167
dtype: float64
- name: query_emb_168
dtype: float64
- name: query_emb_169
dtype: float64
- name: query_emb_170
dtype: float64
- name: query_emb_171
dtype: float64
- name: query_emb_172
dtype: float64
- name: query_emb_173
dtype: float64
- name: query_emb_174
dtype: float64
- name: query_emb_175
dtype: float64
- name: query_emb_176
dtype: float64
- name: query_emb_177
dtype: float64
- name: query_emb_178
dtype: float64
- name: query_emb_179
dtype: float64
- name: query_emb_180
dtype: float64
- name: query_emb_181
dtype: float64
- name: query_emb_182
dtype: float64
- name: query_emb_183
dtype: float64
- name: query_emb_184
dtype: float64
- name: query_emb_185
dtype: float64
- name: query_emb_186
dtype: float64
- name: query_emb_187
dtype: float64
- name: query_emb_188
dtype: float64
- name: query_emb_189
dtype: float64
- name: query_emb_190
dtype: float64
- name: query_emb_191
dtype: float64
- name: query_emb_192
dtype: float64
- name: query_emb_193
dtype: float64
- name: query_emb_194
dtype: float64
- name: query_emb_195
dtype: float64
- name: query_emb_196
dtype: float64
- name: query_emb_197
dtype: float64
- name: query_emb_198
dtype: float64
- name: query_emb_199
dtype: float64
- name: query_emb_200
dtype: float64
- name: query_emb_201
dtype: float64
- name: query_emb_202
dtype: float64
- name: query_emb_203
dtype: float64
- name: query_emb_204
dtype: float64
- name: query_emb_205
dtype: float64
- name: query_emb_206
dtype: float64
- name: query_emb_207
dtype: float64
- name: query_emb_208
dtype: float64
- name: query_emb_209
dtype: float64
- name: query_emb_210
dtype: float64
- name: query_emb_211
dtype: float64
- name: query_emb_212
dtype: float64
- name: query_emb_213
dtype: float64
- name: query_emb_214
dtype: float64
- name: query_emb_215
dtype: float64
- name: query_emb_216
dtype: float64
- name: query_emb_217
dtype: float64
- name: query_emb_218
dtype: float64
- name: query_emb_219
dtype: float64
- name: query_emb_220
dtype: float64
- name: query_emb_221
dtype: float64
- name: query_emb_222
dtype: float64
- name: query_emb_223
dtype: float64
- name: query_emb_224
dtype: float64
- name: query_emb_225
dtype: float64
- name: query_emb_226
dtype: float64
- name: query_emb_227
dtype: float64
- name: query_emb_228
dtype: float64
- name: query_emb_229
dtype: float64
- name: query_emb_230
dtype: float64
- name: query_emb_231
dtype: float64
- name: query_emb_232
dtype: float64
- name: query_emb_233
dtype: float64
- name: query_emb_234
dtype: float64
- name: query_emb_235
dtype: float64
- name: query_emb_236
dtype: float64
- name: query_emb_237
dtype: float64
- name: query_emb_238
dtype: float64
- name: query_emb_239
dtype: float64
- name: query_emb_240
dtype: float64
- name: query_emb_241
dtype: float64
- name: query_emb_242
dtype: float64
- name: query_emb_243
dtype: float64
- name: query_emb_244
dtype: float64
- name: query_emb_245
dtype: float64
- name: query_emb_246
dtype: float64
- name: query_emb_247
dtype: float64
- name: query_emb_248
dtype: float64
- name: query_emb_249
dtype: float64
- name: query_emb_250
dtype: float64
- name: query_emb_251
dtype: float64
- name: query_emb_252
dtype: float64
- name: query_emb_253
dtype: float64
- name: query_emb_254
dtype: float64
- name: query_emb_255
dtype: float64
- name: query_emb_256
dtype: float64
- name: query_emb_257
dtype: float64
- name: query_emb_258
dtype: float64
- name: query_emb_259
dtype: float64
- name: query_emb_260
dtype: float64
- name: query_emb_261
dtype: float64
- name: query_emb_262
dtype: float64
- name: query_emb_263
dtype: float64
- name: query_emb_264
dtype: float64
- name: query_emb_265
dtype: float64
- name: query_emb_266
dtype: float64
- name: query_emb_267
dtype: float64
- name: query_emb_268
dtype: float64
- name: query_emb_269
dtype: float64
- name: query_emb_270
dtype: float64
- name: query_emb_271
dtype: float64
- name: query_emb_272
dtype: float64
- name: query_emb_273
dtype: float64
- name: query_emb_274
dtype: float64
- name: query_emb_275
dtype: float64
- name: query_emb_276
dtype: float64
- name: query_emb_277
dtype: float64
- name: query_emb_278
dtype: float64
- name: query_emb_279
dtype: float64
- name: query_emb_280
dtype: float64
- name: query_emb_281
dtype: float64
- name: query_emb_282
dtype: float64
- name: query_emb_283
dtype: float64
- name: query_emb_284
dtype: float64
- name: query_emb_285
dtype: float64
- name: query_emb_286
dtype: float64
- name: query_emb_287
dtype: float64
- name: query_emb_288
dtype: float64
- name: query_emb_289
dtype: float64
- name: query_emb_290
dtype: float64
- name: query_emb_291
dtype: float64
- name: query_emb_292
dtype: float64
- name: query_emb_293
dtype: float64
- name: query_emb_294
dtype: float64
- name: query_emb_295
dtype: float64
- name: query_emb_296
dtype: float64
- name: query_emb_297
dtype: float64
- name: query_emb_298
dtype: float64
- name: query_emb_299
dtype: float64
- name: query_emb_300
dtype: float64
- name: query_emb_301
dtype: float64
- name: query_emb_302
dtype: float64
- name: query_emb_303
dtype: float64
- name: query_emb_304
dtype: float64
- name: query_emb_305
dtype: float64
- name: query_emb_306
dtype: float64
- name: query_emb_307
dtype: float64
- name: query_emb_308
dtype: float64
- name: query_emb_309
dtype: float64
- name: query_emb_310
dtype: float64
- name: query_emb_311
dtype: float64
- name: query_emb_312
dtype: float64
- name: query_emb_313
dtype: float64
- name: query_emb_314
dtype: float64
- name: query_emb_315
dtype: float64
- name: query_emb_316
dtype: float64
- name: query_emb_317
dtype: float64
- name: query_emb_318
dtype: float64
- name: query_emb_319
dtype: float64
- name: query_emb_320
dtype: float64
- name: query_emb_321
dtype: float64
- name: query_emb_322
dtype: float64
- name: query_emb_323
dtype: float64
- name: query_emb_324
dtype: float64
- name: query_emb_325
dtype: float64
- name: query_emb_326
dtype: float64
- name: query_emb_327
dtype: float64
- name: query_emb_328
dtype: float64
- name: query_emb_329
dtype: float64
- name: query_emb_330
dtype: float64
- name: query_emb_331
dtype: float64
- name: query_emb_332
dtype: float64
- name: query_emb_333
dtype: float64
- name: query_emb_334
dtype: float64
- name: query_emb_335
dtype: float64
- name: query_emb_336
dtype: float64
- name: query_emb_337
dtype: float64
- name: query_emb_338
dtype: float64
- name: query_emb_339
dtype: float64
- name: query_emb_340
dtype: float64
- name: query_emb_341
dtype: float64
- name: query_emb_342
dtype: float64
- name: query_emb_343
dtype: float64
- name: query_emb_344
dtype: float64
- name: query_emb_345
dtype: float64
- name: query_emb_346
dtype: float64
- name: query_emb_347
dtype: float64
- name: query_emb_348
dtype: float64
- name: query_emb_349
dtype: float64
- name: query_emb_350
dtype: float64
- name: query_emb_351
dtype: float64
- name: query_emb_352
dtype: float64
- name: query_emb_353
dtype: float64
- name: query_emb_354
dtype: float64
- name: query_emb_355
dtype: float64
- name: query_emb_356
dtype: float64
- name: query_emb_357
dtype: float64
- name: query_emb_358
dtype: float64
- name: query_emb_359
dtype: float64
- name: query_emb_360
dtype: float64
- name: query_emb_361
dtype: float64
- name: query_emb_362
dtype: float64
- name: query_emb_363
dtype: float64
- name: query_emb_364
dtype: float64
- name: query_emb_365
dtype: float64
- name: query_emb_366
dtype: float64
- name: query_emb_367
dtype: float64
- name: query_emb_368
dtype: float64
- name: query_emb_369
dtype: float64
- name: query_emb_370
dtype: float64
- name: query_emb_371
dtype: float64
- name: query_emb_372
dtype: float64
- name: query_emb_373
dtype: float64
- name: query_emb_374
dtype: float64
- name: query_emb_375
dtype: float64
- name: query_emb_376
dtype: float64
- name: query_emb_377
dtype: float64
- name: query_emb_378
dtype: float64
- name: query_emb_379
dtype: float64
- name: query_emb_380
dtype: float64
- name: query_emb_381
dtype: float64
- name: query_emb_382
dtype: float64
- name: query_emb_383
dtype: float64
- name: query_emb_384
dtype: float64
- name: query_emb_385
dtype: float64
- name: query_emb_386
dtype: float64
- name: query_emb_387
dtype: float64
- name: query_emb_388
dtype: float64
- name: query_emb_389
dtype: float64
- name: query_emb_390
dtype: float64
- name: query_emb_391
dtype: float64
- name: query_emb_392
dtype: float64
- name: query_emb_393
dtype: float64
- name: query_emb_394
dtype: float64
- name: query_emb_395
dtype: float64
- name: query_emb_396
dtype: float64
- name: query_emb_397
dtype: float64
- name: query_emb_398
dtype: float64
- name: query_emb_399
dtype: float64
- name: query_emb_400
dtype: float64
- name: query_emb_401
dtype: float64
- name: query_emb_402
dtype: float64
- name: query_emb_403
dtype: float64
- name: query_emb_404
dtype: float64
- name: query_emb_405
dtype: float64
- name: query_emb_406
dtype: float64
- name: query_emb_407
dtype: float64
- name: query_emb_408
dtype: float64
- name: query_emb_409
dtype: float64
- name: query_emb_410
dtype: float64
- name: query_emb_411
dtype: float64
- name: query_emb_412
dtype: float64
- name: query_emb_413
dtype: float64
- name: query_emb_414
dtype: float64
- name: query_emb_415
dtype: float64
- name: query_emb_416
dtype: float64
- name: query_emb_417
dtype: float64
- name: query_emb_418
dtype: float64
- name: query_emb_419
dtype: float64
- name: query_emb_420
dtype: float64
- name: query_emb_421
dtype: float64
- name: query_emb_422
dtype: float64
- name: query_emb_423
dtype: float64
- name: query_emb_424
dtype: float64
- name: query_emb_425
dtype: float64
- name: query_emb_426
dtype: float64
- name: query_emb_427
dtype: float64
- name: query_emb_428
dtype: float64
- name: query_emb_429
dtype: float64
- name: query_emb_430
dtype: float64
- name: query_emb_431
dtype: float64
- name: query_emb_432
dtype: float64
- name: query_emb_433
dtype: float64
- name: query_emb_434
dtype: float64
- name: query_emb_435
dtype: float64
- name: query_emb_436
dtype: float64
- name: query_emb_437
dtype: float64
- name: query_emb_438
dtype: float64
- name: query_emb_439
dtype: float64
- name: query_emb_440
dtype: float64
- name: query_emb_441
dtype: float64
- name: query_emb_442
dtype: float64
- name: query_emb_443
dtype: float64
- name: query_emb_444
dtype: float64
- name: query_emb_445
dtype: float64
- name: query_emb_446
dtype: float64
- name: query_emb_447
dtype: float64
- name: query_emb_448
dtype: float64
- name: query_emb_449
dtype: float64
- name: query_emb_450
dtype: float64
- name: query_emb_451
dtype: float64
- name: query_emb_452
dtype: float64
- name: query_emb_453
dtype: float64
- name: query_emb_454
dtype: float64
- name: query_emb_455
dtype: float64
- name: query_emb_456
dtype: float64
- name: query_emb_457
dtype: float64
- name: query_emb_458
dtype: float64
- name: query_emb_459
dtype: float64
- name: query_emb_460
dtype: float64
- name: query_emb_461
dtype: float64
- name: query_emb_462
dtype: float64
- name: query_emb_463
dtype: float64
- name: query_emb_464
dtype: float64
- name: query_emb_465
dtype: float64
- name: query_emb_466
dtype: float64
- name: query_emb_467
dtype: float64
- name: query_emb_468
dtype: float64
- name: query_emb_469
dtype: float64
- name: query_emb_470
dtype: float64
- name: query_emb_471
dtype: float64
- name: query_emb_472
dtype: float64
- name: query_emb_473
dtype: float64
- name: query_emb_474
dtype: float64
- name: query_emb_475
dtype: float64
- name: query_emb_476
dtype: float64
- name: query_emb_477
dtype: float64
- name: query_emb_478
dtype: float64
- name: query_emb_479
dtype: float64
- name: query_emb_480
dtype: float64
- name: query_emb_481
dtype: float64
- name: query_emb_482
dtype: float64
- name: query_emb_483
dtype: float64
- name: query_emb_484
dtype: float64
- name: query_emb_485
dtype: float64
- name: query_emb_486
dtype: float64
- name: query_emb_487
dtype: float64
- name: query_emb_488
dtype: float64
- name: query_emb_489
dtype: float64
- name: query_emb_490
dtype: float64
- name: query_emb_491
dtype: float64
- name: query_emb_492
dtype: float64
- name: query_emb_493
dtype: float64
- name: query_emb_494
dtype: float64
- name: query_emb_495
dtype: float64
- name: query_emb_496
dtype: float64
- name: query_emb_497
dtype: float64
- name: query_emb_498
dtype: float64
- name: query_emb_499
dtype: float64
- name: query_emb_500
dtype: float64
- name: query_emb_501
dtype: float64
- name: query_emb_502
dtype: float64
- name: query_emb_503
dtype: float64
- name: query_emb_504
dtype: float64
- name: query_emb_505
dtype: float64
- name: query_emb_506
dtype: float64
- name: query_emb_507
dtype: float64
- name: query_emb_508
dtype: float64
- name: query_emb_509
dtype: float64
- name: query_emb_510
dtype: float64
- name: query_emb_511
dtype: float64
- name: query_emb_512
dtype: float64
- name: query_emb_513
dtype: float64
- name: query_emb_514
dtype: float64
- name: query_emb_515
dtype: float64
- name: query_emb_516
dtype: float64
- name: query_emb_517
dtype: float64
- name: query_emb_518
dtype: float64
- name: query_emb_519
dtype: float64
- name: query_emb_520
dtype: float64
- name: query_emb_521
dtype: float64
- name: query_emb_522
dtype: float64
- name: query_emb_523
dtype: float64
- name: query_emb_524
dtype: float64
- name: query_emb_525
dtype: float64
- name: query_emb_526
dtype: float64
- name: query_emb_527
dtype: float64
- name: query_emb_528
dtype: float64
- name: query_emb_529
dtype: float64
- name: query_emb_530
dtype: float64
- name: query_emb_531
dtype: float64
- name: query_emb_532
dtype: float64
- name: query_emb_533
dtype: float64
- name: query_emb_534
dtype: float64
- name: query_emb_535
dtype: float64
- name: query_emb_536
dtype: float64
- name: query_emb_537
dtype: float64
- name: query_emb_538
dtype: float64
- name: query_emb_539
dtype: float64
- name: query_emb_540
dtype: float64
- name: query_emb_541
dtype: float64
- name: query_emb_542
dtype: float64
- name: query_emb_543
dtype: float64
- name: query_emb_544
dtype: float64
- name: query_emb_545
dtype: float64
- name: query_emb_546
dtype: float64
- name: query_emb_547
dtype: float64
- name: query_emb_548
dtype: float64
- name: query_emb_549
dtype: float64
- name: query_emb_550
dtype: float64
- name: query_emb_551
dtype: float64
- name: query_emb_552
dtype: float64
- name: query_emb_553
dtype: float64
- name: query_emb_554
dtype: float64
- name: query_emb_555
dtype: float64
- name: query_emb_556
dtype: float64
- name: query_emb_557
dtype: float64
- name: query_emb_558
dtype: float64
- name: query_emb_559
dtype: float64
- name: query_emb_560
dtype: float64
- name: query_emb_561
dtype: float64
- name: query_emb_562
dtype: float64
- name: query_emb_563
dtype: float64
- name: query_emb_564
dtype: float64
- name: query_emb_565
dtype: float64
- name: query_emb_566
dtype: float64
- name: query_emb_567
dtype: float64
- name: query_emb_568
dtype: float64
- name: query_emb_569
dtype: float64
- name: query_emb_570
dtype: float64
- name: query_emb_571
dtype: float64
- name: query_emb_572
dtype: float64
- name: query_emb_573
dtype: float64
- name: query_emb_574
dtype: float64
- name: query_emb_575
dtype: float64
- name: query_emb_576
dtype: float64
- name: query_emb_577
dtype: float64
- name: query_emb_578
dtype: float64
- name: query_emb_579
dtype: float64
- name: query_emb_580
dtype: float64
- name: query_emb_581
dtype: float64
- name: query_emb_582
dtype: float64
- name: query_emb_583
dtype: float64
- name: query_emb_584
dtype: float64
- name: query_emb_585
dtype: float64
- name: query_emb_586
dtype: float64
- name: query_emb_587
dtype: float64
- name: query_emb_588
dtype: float64
- name: query_emb_589
dtype: float64
- name: query_emb_590
dtype: float64
- name: query_emb_591
dtype: float64
- name: query_emb_592
dtype: float64
- name: query_emb_593
dtype: float64
- name: query_emb_594
dtype: float64
- name: query_emb_595
dtype: float64
- name: query_emb_596
dtype: float64
- name: query_emb_597
dtype: float64
- name: query_emb_598
dtype: float64
- name: query_emb_599
dtype: float64
- name: query_emb_600
dtype: float64
- name: query_emb_601
dtype: float64
- name: query_emb_602
dtype: float64
- name: query_emb_603
dtype: float64
- name: query_emb_604
dtype: float64
- name: query_emb_605
dtype: float64
- name: query_emb_606
dtype: float64
- name: query_emb_607
dtype: float64
- name: query_emb_608
dtype: float64
- name: query_emb_609
dtype: float64
- name: query_emb_610
dtype: float64
- name: query_emb_611
dtype: float64
- name: query_emb_612
dtype: float64
- name: query_emb_613
dtype: float64
- name: query_emb_614
dtype: float64
- name: query_emb_615
dtype: float64
- name: query_emb_616
dtype: float64
- name: query_emb_617
dtype: float64
- name: query_emb_618
dtype: float64
- name: query_emb_619
dtype: float64
- name: query_emb_620
dtype: float64
- name: query_emb_621
dtype: float64
- name: query_emb_622
dtype: float64
- name: query_emb_623
dtype: float64
- name: query_emb_624
dtype: float64
- name: query_emb_625
dtype: float64
- name: query_emb_626
dtype: float64
- name: query_emb_627
dtype: float64
- name: query_emb_628
dtype: float64
- name: query_emb_629
dtype: float64
- name: query_emb_630
dtype: float64
- name: query_emb_631
dtype: float64
- name: query_emb_632
dtype: float64
- name: query_emb_633
dtype: float64
- name: query_emb_634
dtype: float64
- name: query_emb_635
dtype: float64
- name: query_emb_636
dtype: float64
- name: query_emb_637
dtype: float64
- name: query_emb_638
dtype: float64
- name: query_emb_639
dtype: float64
- name: query_emb_640
dtype: float64
- name: query_emb_641
dtype: float64
- name: query_emb_642
dtype: float64
- name: query_emb_643
dtype: float64
- name: query_emb_644
dtype: float64
- name: query_emb_645
dtype: float64
- name: query_emb_646
dtype: float64
- name: query_emb_647
dtype: float64
- name: query_emb_648
dtype: float64
- name: query_emb_649
dtype: float64
- name: query_emb_650
dtype: float64
- name: query_emb_651
dtype: float64
- name: query_emb_652
dtype: float64
- name: query_emb_653
dtype: float64
- name: query_emb_654
dtype: float64
- name: query_emb_655
dtype: float64
- name: query_emb_656
dtype: float64
- name: query_emb_657
dtype: float64
- name: query_emb_658
dtype: float64
- name: query_emb_659
dtype: float64
- name: query_emb_660
dtype: float64
- name: query_emb_661
dtype: float64
- name: query_emb_662
dtype: float64
- name: query_emb_663
dtype: float64
- name: query_emb_664
dtype: float64
- name: query_emb_665
dtype: float64
- name: query_emb_666
dtype: float64
- name: query_emb_667
dtype: float64
- name: query_emb_668
dtype: float64
- name: query_emb_669
dtype: float64
- name: query_emb_670
dtype: float64
- name: query_emb_671
dtype: float64
- name: query_emb_672
dtype: float64
- name: query_emb_673
dtype: float64
- name: query_emb_674
dtype: float64
- name: query_emb_675
dtype: float64
- name: query_emb_676
dtype: float64
- name: query_emb_677
dtype: float64
- name: query_emb_678
dtype: float64
- name: query_emb_679
dtype: float64
- name: query_emb_680
dtype: float64
- name: query_emb_681
dtype: float64
- name: query_emb_682
dtype: float64
- name: query_emb_683
dtype: float64
- name: query_emb_684
dtype: float64
- name: query_emb_685
dtype: float64
- name: query_emb_686
dtype: float64
- name: query_emb_687
dtype: float64
- name: query_emb_688
dtype: float64
- name: query_emb_689
dtype: float64
- name: query_emb_690
dtype: float64
- name: query_emb_691
dtype: float64
- name: query_emb_692
dtype: float64
- name: query_emb_693
dtype: float64
- name: query_emb_694
dtype: float64
- name: query_emb_695
dtype: float64
- name: query_emb_696
dtype: float64
- name: query_emb_697
dtype: float64
- name: query_emb_698
dtype: float64
- name: query_emb_699
dtype: float64
- name: query_emb_700
dtype: float64
- name: query_emb_701
dtype: float64
- name: query_emb_702
dtype: float64
- name: query_emb_703
dtype: float64
- name: query_emb_704
dtype: float64
- name: query_emb_705
dtype: float64
- name: query_emb_706
dtype: float64
- name: query_emb_707
dtype: float64
- name: query_emb_708
dtype: float64
- name: query_emb_709
dtype: float64
- name: query_emb_710
dtype: float64
- name: query_emb_711
dtype: float64
- name: query_emb_712
dtype: float64
- name: query_emb_713
dtype: float64
- name: query_emb_714
dtype: float64
- name: query_emb_715
dtype: float64
- name: query_emb_716
dtype: float64
- name: query_emb_717
dtype: float64
- name: query_emb_718
dtype: float64
- name: query_emb_719
dtype: float64
- name: query_emb_720
dtype: float64
- name: query_emb_721
dtype: float64
- name: query_emb_722
dtype: float64
- name: query_emb_723
dtype: float64
- name: query_emb_724
dtype: float64
- name: query_emb_725
dtype: float64
- name: query_emb_726
dtype: float64
- name: query_emb_727
dtype: float64
- name: query_emb_728
dtype: float64
- name: query_emb_729
dtype: float64
- name: query_emb_730
dtype: float64
- name: query_emb_731
dtype: float64
- name: query_emb_732
dtype: float64
- name: query_emb_733
dtype: float64
- name: query_emb_734
dtype: float64
- name: query_emb_735
dtype: float64
- name: query_emb_736
dtype: float64
- name: query_emb_737
dtype: float64
- name: query_emb_738
dtype: float64
- name: query_emb_739
dtype: float64
- name: query_emb_740
dtype: float64
- name: query_emb_741
dtype: float64
- name: query_emb_742
dtype: float64
- name: query_emb_743
dtype: float64
- name: query_emb_744
dtype: float64
- name: query_emb_745
dtype: float64
- name: query_emb_746
dtype: float64
- name: query_emb_747
dtype: float64
- name: query_emb_748
dtype: float64
- name: query_emb_749
dtype: float64
- name: query_emb_750
dtype: float64
- name: query_emb_751
dtype: float64
- name: query_emb_752
dtype: float64
- name: query_emb_753
dtype: float64
- name: query_emb_754
dtype: float64
- name: query_emb_755
dtype: float64
- name: query_emb_756
dtype: float64
- name: query_emb_757
dtype: float64
- name: query_emb_758
dtype: float64
- name: query_emb_759
dtype: float64
- name: query_emb_760
dtype: float64
- name: query_emb_761
dtype: float64
- name: query_emb_762
dtype: float64
- name: query_emb_763
dtype: float64
- name: query_emb_764
dtype: float64
- name: query_emb_765
dtype: float64
- name: query_emb_766
dtype: float64
- name: query_emb_767
dtype: float64
- name: context_emb_0
dtype: float64
- name: context_emb_1
dtype: float64
- name: context_emb_2
dtype: float64
- name: context_emb_3
dtype: float64
- name: context_emb_4
dtype: float64
- name: context_emb_5
dtype: float64
- name: context_emb_6
dtype: float64
- name: context_emb_7
dtype: float64
- name: context_emb_8
dtype: float64
- name: context_emb_9
dtype: float64
- name: context_emb_10
dtype: float64
- name: context_emb_11
dtype: float64
- name: context_emb_12
dtype: float64
- name: context_emb_13
dtype: float64
- name: context_emb_14
dtype: float64
- name: context_emb_15
dtype: float64
- name: context_emb_16
dtype: float64
- name: context_emb_17
dtype: float64
- name: context_emb_18
dtype: float64
- name: context_emb_19
dtype: float64
- name: context_emb_20
dtype: float64
- name: context_emb_21
dtype: float64
- name: context_emb_22
dtype: float64
- name: context_emb_23
dtype: float64
- name: context_emb_24
dtype: float64
- name: context_emb_25
dtype: float64
- name: context_emb_26
dtype: float64
- name: context_emb_27
dtype: float64
- name: context_emb_28
dtype: float64
- name: context_emb_29
dtype: float64
- name: context_emb_30
dtype: float64
- name: context_emb_31
dtype: float64
- name: context_emb_32
dtype: float64
- name: context_emb_33
dtype: float64
- name: context_emb_34
dtype: float64
- name: context_emb_35
dtype: float64
- name: context_emb_36
dtype: float64
- name: context_emb_37
dtype: float64
- name: context_emb_38
dtype: float64
- name: context_emb_39
dtype: float64
- name: context_emb_40
dtype: float64
- name: context_emb_41
dtype: float64
- name: context_emb_42
dtype: float64
- name: context_emb_43
dtype: float64
- name: context_emb_44
dtype: float64
- name: context_emb_45
dtype: float64
- name: context_emb_46
dtype: float64
- name: context_emb_47
dtype: float64
- name: context_emb_48
dtype: float64
- name: context_emb_49
dtype: float64
- name: context_emb_50
dtype: float64
- name: context_emb_51
dtype: float64
- name: context_emb_52
dtype: float64
- name: context_emb_53
dtype: float64
- name: context_emb_54
dtype: float64
- name: context_emb_55
dtype: float64
- name: context_emb_56
dtype: float64
- name: context_emb_57
dtype: float64
- name: context_emb_58
dtype: float64
- name: context_emb_59
dtype: float64
- name: context_emb_60
dtype: float64
- name: context_emb_61
dtype: float64
- name: context_emb_62
dtype: float64
- name: context_emb_63
dtype: float64
- name: context_emb_64
dtype: float64
- name: context_emb_65
dtype: float64
- name: context_emb_66
dtype: float64
- name: context_emb_67
dtype: float64
- name: context_emb_68
dtype: float64
- name: context_emb_69
dtype: float64
- name: context_emb_70
dtype: float64
- name: context_emb_71
dtype: float64
- name: context_emb_72
dtype: float64
- name: context_emb_73
dtype: float64
- name: context_emb_74
dtype: float64
- name: context_emb_75
dtype: float64
- name: context_emb_76
dtype: float64
- name: context_emb_77
dtype: float64
- name: context_emb_78
dtype: float64
- name: context_emb_79
dtype: float64
- name: context_emb_80
dtype: float64
- name: context_emb_81
dtype: float64
- name: context_emb_82
dtype: float64
- name: context_emb_83
dtype: float64
- name: context_emb_84
dtype: float64
- name: context_emb_85
dtype: float64
- name: context_emb_86
dtype: float64
- name: context_emb_87
dtype: float64
- name: context_emb_88
dtype: float64
- name: context_emb_89
dtype: float64
- name: context_emb_90
dtype: float64
- name: context_emb_91
dtype: float64
- name: context_emb_92
dtype: float64
- name: context_emb_93
dtype: float64
- name: context_emb_94
dtype: float64
- name: context_emb_95
dtype: float64
- name: context_emb_96
dtype: float64
- name: context_emb_97
dtype: float64
- name: context_emb_98
dtype: float64
- name: context_emb_99
dtype: float64
- name: context_emb_100
dtype: float64
- name: context_emb_101
dtype: float64
- name: context_emb_102
dtype: float64
- name: context_emb_103
dtype: float64
- name: context_emb_104
dtype: float64
- name: context_emb_105
dtype: float64
- name: context_emb_106
dtype: float64
- name: context_emb_107
dtype: float64
- name: context_emb_108
dtype: float64
- name: context_emb_109
dtype: float64
- name: context_emb_110
dtype: float64
- name: context_emb_111
dtype: float64
- name: context_emb_112
dtype: float64
- name: context_emb_113
dtype: float64
- name: context_emb_114
dtype: float64
- name: context_emb_115
dtype: float64
- name: context_emb_116
dtype: float64
- name: context_emb_117
dtype: float64
- name: context_emb_118
dtype: float64
- name: context_emb_119
dtype: float64
- name: context_emb_120
dtype: float64
- name: context_emb_121
dtype: float64
- name: context_emb_122
dtype: float64
- name: context_emb_123
dtype: float64
- name: context_emb_124
dtype: float64
- name: context_emb_125
dtype: float64
- name: context_emb_126
dtype: float64
- name: context_emb_127
dtype: float64
- name: context_emb_128
dtype: float64
- name: context_emb_129
dtype: float64
- name: context_emb_130
dtype: float64
- name: context_emb_131
dtype: float64
- name: context_emb_132
dtype: float64
- name: context_emb_133
dtype: float64
- name: context_emb_134
dtype: float64
- name: context_emb_135
dtype: float64
- name: context_emb_136
dtype: float64
- name: context_emb_137
dtype: float64
- name: context_emb_138
dtype: float64
- name: context_emb_139
dtype: float64
- name: context_emb_140
dtype: float64
- name: context_emb_141
dtype: float64
- name: context_emb_142
dtype: float64
- name: context_emb_143
dtype: float64
- name: context_emb_144
dtype: float64
- name: context_emb_145
dtype: float64
- name: context_emb_146
dtype: float64
- name: context_emb_147
dtype: float64
- name: context_emb_148
dtype: float64
- name: context_emb_149
dtype: float64
- name: context_emb_150
dtype: float64
- name: context_emb_151
dtype: float64
- name: context_emb_152
dtype: float64
- name: context_emb_153
dtype: float64
- name: context_emb_154
dtype: float64
- name: context_emb_155
dtype: float64
- name: context_emb_156
dtype: float64
- name: context_emb_157
dtype: float64
- name: context_emb_158
dtype: float64
- name: context_emb_159
dtype: float64
- name: context_emb_160
dtype: float64
- name: context_emb_161
dtype: float64
- name: context_emb_162
dtype: float64
- name: context_emb_163
dtype: float64
- name: context_emb_164
dtype: float64
- name: context_emb_165
dtype: float64
- name: context_emb_166
dtype: float64
- name: context_emb_167
dtype: float64
- name: context_emb_168
dtype: float64
- name: context_emb_169
dtype: float64
- name: context_emb_170
dtype: float64
- name: context_emb_171
dtype: float64
- name: context_emb_172
dtype: float64
- name: context_emb_173
dtype: float64
- name: context_emb_174
dtype: float64
- name: context_emb_175
dtype: float64
- name: context_emb_176
dtype: float64
- name: context_emb_177
dtype: float64
- name: context_emb_178
dtype: float64
- name: context_emb_179
dtype: float64
- name: context_emb_180
dtype: float64
- name: context_emb_181
dtype: float64
- name: context_emb_182
dtype: float64
- name: context_emb_183
dtype: float64
- name: context_emb_184
dtype: float64
- name: context_emb_185
dtype: float64
- name: context_emb_186
dtype: float64
- name: context_emb_187
dtype: float64
- name: context_emb_188
dtype: float64
- name: context_emb_189
dtype: float64
- name: context_emb_190
dtype: float64
- name: context_emb_191
dtype: float64
- name: context_emb_192
dtype: float64
- name: context_emb_193
dtype: float64
- name: context_emb_194
dtype: float64
- name: context_emb_195
dtype: float64
- name: context_emb_196
dtype: float64
- name: context_emb_197
dtype: float64
- name: context_emb_198
dtype: float64
- name: context_emb_199
dtype: float64
- name: context_emb_200
dtype: float64
- name: context_emb_201
dtype: float64
- name: context_emb_202
dtype: float64
- name: context_emb_203
dtype: float64
- name: context_emb_204
dtype: float64
- name: context_emb_205
dtype: float64
- name: context_emb_206
dtype: float64
- name: context_emb_207
dtype: float64
- name: context_emb_208
dtype: float64
- name: context_emb_209
dtype: float64
- name: context_emb_210
dtype: float64
- name: context_emb_211
dtype: float64
- name: context_emb_212
dtype: float64
- name: context_emb_213
dtype: float64
- name: context_emb_214
dtype: float64
- name: context_emb_215
dtype: float64
- name: context_emb_216
dtype: float64
- name: context_emb_217
dtype: float64
- name: context_emb_218
dtype: float64
- name: context_emb_219
dtype: float64
- name: context_emb_220
dtype: float64
- name: context_emb_221
dtype: float64
- name: context_emb_222
dtype: float64
- name: context_emb_223
dtype: float64
- name: context_emb_224
dtype: float64
- name: context_emb_225
dtype: float64
- name: context_emb_226
dtype: float64
- name: context_emb_227
dtype: float64
- name: context_emb_228
dtype: float64
- name: context_emb_229
dtype: float64
- name: context_emb_230
dtype: float64
- name: context_emb_231
dtype: float64
- name: context_emb_232
dtype: float64
- name: context_emb_233
dtype: float64
- name: context_emb_234
dtype: float64
- name: context_emb_235
dtype: float64
- name: context_emb_236
dtype: float64
- name: context_emb_237
dtype: float64
- name: context_emb_238
dtype: float64
- name: context_emb_239
dtype: float64
- name: context_emb_240
dtype: float64
- name: context_emb_241
dtype: float64
- name: context_emb_242
dtype: float64
- name: context_emb_243
dtype: float64
- name: context_emb_244
dtype: float64
- name: context_emb_245
dtype: float64
- name: context_emb_246
dtype: float64
- name: context_emb_247
dtype: float64
- name: context_emb_248
dtype: float64
- name: context_emb_249
dtype: float64
- name: context_emb_250
dtype: float64
- name: context_emb_251
dtype: float64
- name: context_emb_252
dtype: float64
- name: context_emb_253
dtype: float64
- name: context_emb_254
dtype: float64
- name: context_emb_255
dtype: float64
- name: context_emb_256
dtype: float64
- name: context_emb_257
dtype: float64
- name: context_emb_258
dtype: float64
- name: context_emb_259
dtype: float64
- name: context_emb_260
dtype: float64
- name: context_emb_261
dtype: float64
- name: context_emb_262
dtype: float64
- name: context_emb_263
dtype: float64
- name: context_emb_264
dtype: float64
- name: context_emb_265
dtype: float64
- name: context_emb_266
dtype: float64
- name: context_emb_267
dtype: float64
- name: context_emb_268
dtype: float64
- name: context_emb_269
dtype: float64
- name: context_emb_270
dtype: float64
- name: context_emb_271
dtype: float64
- name: context_emb_272
dtype: float64
- name: context_emb_273
dtype: float64
- name: context_emb_274
dtype: float64
- name: context_emb_275
dtype: float64
- name: context_emb_276
dtype: float64
- name: context_emb_277
dtype: float64
- name: context_emb_278
dtype: float64
- name: context_emb_279
dtype: float64
- name: context_emb_280
dtype: float64
- name: context_emb_281
dtype: float64
- name: context_emb_282
dtype: float64
- name: context_emb_283
dtype: float64
- name: context_emb_284
dtype: float64
- name: context_emb_285
dtype: float64
- name: context_emb_286
dtype: float64
- name: context_emb_287
dtype: float64
- name: context_emb_288
dtype: float64
- name: context_emb_289
dtype: float64
- name: context_emb_290
dtype: float64
- name: context_emb_291
dtype: float64
- name: context_emb_292
dtype: float64
- name: context_emb_293
dtype: float64
- name: context_emb_294
dtype: float64
- name: context_emb_295
dtype: float64
- name: context_emb_296
dtype: float64
- name: context_emb_297
dtype: float64
- name: context_emb_298
dtype: float64
- name: context_emb_299
dtype: float64
- name: context_emb_300
dtype: float64
- name: context_emb_301
dtype: float64
- name: context_emb_302
dtype: float64
- name: context_emb_303
dtype: float64
- name: context_emb_304
dtype: float64
- name: context_emb_305
dtype: float64
- name: context_emb_306
dtype: float64
- name: context_emb_307
dtype: float64
- name: context_emb_308
dtype: float64
- name: context_emb_309
dtype: float64
- name: context_emb_310
dtype: float64
- name: context_emb_311
dtype: float64
- name: context_emb_312
dtype: float64
- name: context_emb_313
dtype: float64
- name: context_emb_314
dtype: float64
- name: context_emb_315
dtype: float64
- name: context_emb_316
dtype: float64
- name: context_emb_317
dtype: float64
- name: context_emb_318
dtype: float64
- name: context_emb_319
dtype: float64
- name: context_emb_320
dtype: float64
- name: context_emb_321
dtype: float64
- name: context_emb_322
dtype: float64
- name: context_emb_323
dtype: float64
- name: context_emb_324
dtype: float64
- name: context_emb_325
dtype: float64
- name: context_emb_326
dtype: float64
- name: context_emb_327
dtype: float64
- name: context_emb_328
dtype: float64
- name: context_emb_329
dtype: float64
- name: context_emb_330
dtype: float64
- name: context_emb_331
dtype: float64
- name: context_emb_332
dtype: float64
- name: context_emb_333
dtype: float64
- name: context_emb_334
dtype: float64
- name: context_emb_335
dtype: float64
- name: context_emb_336
dtype: float64
- name: context_emb_337
dtype: float64
- name: context_emb_338
dtype: float64
- name: context_emb_339
dtype: float64
- name: context_emb_340
dtype: float64
- name: context_emb_341
dtype: float64
- name: context_emb_342
dtype: float64
- name: context_emb_343
dtype: float64
- name: context_emb_344
dtype: float64
- name: context_emb_345
dtype: float64
- name: context_emb_346
dtype: float64
- name: context_emb_347
dtype: float64
- name: context_emb_348
dtype: float64
- name: context_emb_349
dtype: float64
- name: context_emb_350
dtype: float64
- name: context_emb_351
dtype: float64
- name: context_emb_352
dtype: float64
- name: context_emb_353
dtype: float64
- name: context_emb_354
dtype: float64
- name: context_emb_355
dtype: float64
- name: context_emb_356
dtype: float64
- name: context_emb_357
dtype: float64
- name: context_emb_358
dtype: float64
- name: context_emb_359
dtype: float64
- name: context_emb_360
dtype: float64
- name: context_emb_361
dtype: float64
- name: context_emb_362
dtype: float64
- name: context_emb_363
dtype: float64
- name: context_emb_364
dtype: float64
- name: context_emb_365
dtype: float64
- name: context_emb_366
dtype: float64
- name: context_emb_367
dtype: float64
- name: context_emb_368
dtype: float64
- name: context_emb_369
dtype: float64
- name: context_emb_370
dtype: float64
- name: context_emb_371
dtype: float64
- name: context_emb_372
dtype: float64
- name: context_emb_373
dtype: float64
- name: context_emb_374
dtype: float64
- name: context_emb_375
dtype: float64
- name: context_emb_376
dtype: float64
- name: context_emb_377
dtype: float64
- name: context_emb_378
dtype: float64
- name: context_emb_379
dtype: float64
- name: context_emb_380
dtype: float64
- name: context_emb_381
dtype: float64
- name: context_emb_382
dtype: float64
- name: context_emb_383
dtype: float64
- name: context_emb_384
dtype: float64
- name: context_emb_385
dtype: float64
- name: context_emb_386
dtype: float64
- name: context_emb_387
dtype: float64
- name: context_emb_388
dtype: float64
- name: context_emb_389
dtype: float64
- name: context_emb_390
dtype: float64
- name: context_emb_391
dtype: float64
- name: context_emb_392
dtype: float64
- name: context_emb_393
dtype: float64
- name: context_emb_394
dtype: float64
- name: context_emb_395
dtype: float64
- name: context_emb_396
dtype: float64
- name: context_emb_397
dtype: float64
- name: context_emb_398
dtype: float64
- name: context_emb_399
dtype: float64
- name: context_emb_400
dtype: float64
- name: context_emb_401
dtype: float64
- name: context_emb_402
dtype: float64
- name: context_emb_403
dtype: float64
- name: context_emb_404
dtype: float64
- name: context_emb_405
dtype: float64
- name: context_emb_406
dtype: float64
- name: context_emb_407
dtype: float64
- name: context_emb_408
dtype: float64
- name: context_emb_409
dtype: float64
- name: context_emb_410
dtype: float64
- name: context_emb_411
dtype: float64
- name: context_emb_412
dtype: float64
- name: context_emb_413
dtype: float64
- name: context_emb_414
dtype: float64
- name: context_emb_415
dtype: float64
- name: context_emb_416
dtype: float64
- name: context_emb_417
dtype: float64
- name: context_emb_418
dtype: float64
- name: context_emb_419
dtype: float64
- name: context_emb_420
dtype: float64
- name: context_emb_421
dtype: float64
- name: context_emb_422
dtype: float64
- name: context_emb_423
dtype: float64
- name: context_emb_424
dtype: float64
- name: context_emb_425
dtype: float64
- name: context_emb_426
dtype: float64
- name: context_emb_427
dtype: float64
- name: context_emb_428
dtype: float64
- name: context_emb_429
dtype: float64
- name: context_emb_430
dtype: float64
- name: context_emb_431
dtype: float64
- name: context_emb_432
dtype: float64
- name: context_emb_433
dtype: float64
- name: context_emb_434
dtype: float64
- name: context_emb_435
dtype: float64
- name: context_emb_436
dtype: float64
- name: context_emb_437
dtype: float64
- name: context_emb_438
dtype: float64
- name: context_emb_439
dtype: float64
- name: context_emb_440
dtype: float64
- name: context_emb_441
dtype: float64
- name: context_emb_442
dtype: float64
- name: context_emb_443
dtype: float64
- name: context_emb_444
dtype: float64
- name: context_emb_445
dtype: float64
- name: context_emb_446
dtype: float64
- name: context_emb_447
dtype: float64
- name: context_emb_448
dtype: float64
- name: context_emb_449
dtype: float64
- name: context_emb_450
dtype: float64
- name: context_emb_451
dtype: float64
- name: context_emb_452
dtype: float64
- name: context_emb_453
dtype: float64
- name: context_emb_454
dtype: float64
- name: context_emb_455
dtype: float64
- name: context_emb_456
dtype: float64
- name: context_emb_457
dtype: float64
- name: context_emb_458
dtype: float64
- name: context_emb_459
dtype: float64
- name: context_emb_460
dtype: float64
- name: context_emb_461
dtype: float64
- name: context_emb_462
dtype: float64
- name: context_emb_463
dtype: float64
- name: context_emb_464
dtype: float64
- name: context_emb_465
dtype: float64
- name: context_emb_466
dtype: float64
- name: context_emb_467
dtype: float64
- name: context_emb_468
dtype: float64
- name: context_emb_469
dtype: float64
- name: context_emb_470
dtype: float64
- name: context_emb_471
dtype: float64
- name: context_emb_472
dtype: float64
- name: context_emb_473
dtype: float64
- name: context_emb_474
dtype: float64
- name: context_emb_475
dtype: float64
- name: context_emb_476
dtype: float64
- name: context_emb_477
dtype: float64
- name: context_emb_478
dtype: float64
- name: context_emb_479
dtype: float64
- name: context_emb_480
dtype: float64
- name: context_emb_481
dtype: float64
- name: context_emb_482
dtype: float64
- name: context_emb_483
dtype: float64
- name: context_emb_484
dtype: float64
- name: context_emb_485
dtype: float64
- name: context_emb_486
dtype: float64
- name: context_emb_487
dtype: float64
- name: context_emb_488
dtype: float64
- name: context_emb_489
dtype: float64
- name: context_emb_490
dtype: float64
- name: context_emb_491
dtype: float64
- name: context_emb_492
dtype: float64
- name: context_emb_493
dtype: float64
- name: context_emb_494
dtype: float64
- name: context_emb_495
dtype: float64
- name: context_emb_496
dtype: float64
- name: context_emb_497
dtype: float64
- name: context_emb_498
dtype: float64
- name: context_emb_499
dtype: float64
- name: context_emb_500
dtype: float64
- name: context_emb_501
dtype: float64
- name: context_emb_502
dtype: float64
- name: context_emb_503
dtype: float64
- name: context_emb_504
dtype: float64
- name: context_emb_505
dtype: float64
- name: context_emb_506
dtype: float64
- name: context_emb_507
dtype: float64
- name: context_emb_508
dtype: float64
- name: context_emb_509
dtype: float64
- name: context_emb_510
dtype: float64
- name: context_emb_511
dtype: float64
- name: context_emb_512
dtype: float64
- name: context_emb_513
dtype: float64
- name: context_emb_514
dtype: float64
- name: context_emb_515
dtype: float64
- name: context_emb_516
dtype: float64
- name: context_emb_517
dtype: float64
- name: context_emb_518
dtype: float64
- name: context_emb_519
dtype: float64
- name: context_emb_520
dtype: float64
- name: context_emb_521
dtype: float64
- name: context_emb_522
dtype: float64
- name: context_emb_523
dtype: float64
- name: context_emb_524
dtype: float64
- name: context_emb_525
dtype: float64
- name: context_emb_526
dtype: float64
- name: context_emb_527
dtype: float64
- name: context_emb_528
dtype: float64
- name: context_emb_529
dtype: float64
- name: context_emb_530
dtype: float64
- name: context_emb_531
dtype: float64
- name: context_emb_532
dtype: float64
- name: context_emb_533
dtype: float64
- name: context_emb_534
dtype: float64
- name: context_emb_535
dtype: float64
- name: context_emb_536
dtype: float64
- name: context_emb_537
dtype: float64
- name: context_emb_538
dtype: float64
- name: context_emb_539
dtype: float64
- name: context_emb_540
dtype: float64
- name: context_emb_541
dtype: float64
- name: context_emb_542
dtype: float64
- name: context_emb_543
dtype: float64
- name: context_emb_544
dtype: float64
- name: context_emb_545
dtype: float64
- name: context_emb_546
dtype: float64
- name: context_emb_547
dtype: float64
- name: context_emb_548
dtype: float64
- name: context_emb_549
dtype: float64
- name: context_emb_550
dtype: float64
- name: context_emb_551
dtype: float64
- name: context_emb_552
dtype: float64
- name: context_emb_553
dtype: float64
- name: context_emb_554
dtype: float64
- name: context_emb_555
dtype: float64
- name: context_emb_556
dtype: float64
- name: context_emb_557
dtype: float64
- name: context_emb_558
dtype: float64
- name: context_emb_559
dtype: float64
- name: context_emb_560
dtype: float64
- name: context_emb_561
dtype: float64
- name: context_emb_562
dtype: float64
- name: context_emb_563
dtype: float64
- name: context_emb_564
dtype: float64
- name: context_emb_565
dtype: float64
- name: context_emb_566
dtype: float64
- name: context_emb_567
dtype: float64
- name: context_emb_568
dtype: float64
- name: context_emb_569
dtype: float64
- name: context_emb_570
dtype: float64
- name: context_emb_571
dtype: float64
- name: context_emb_572
dtype: float64
- name: context_emb_573
dtype: float64
- name: context_emb_574
dtype: float64
- name: context_emb_575
dtype: float64
- name: context_emb_576
dtype: float64
- name: context_emb_577
dtype: float64
- name: context_emb_578
dtype: float64
- name: context_emb_579
dtype: float64
- name: context_emb_580
dtype: float64
- name: context_emb_581
dtype: float64
- name: context_emb_582
dtype: float64
- name: context_emb_583
dtype: float64
- name: context_emb_584
dtype: float64
- name: context_emb_585
dtype: float64
- name: context_emb_586
dtype: float64
- name: context_emb_587
dtype: float64
- name: context_emb_588
dtype: float64
- name: context_emb_589
dtype: float64
- name: context_emb_590
dtype: float64
- name: context_emb_591
dtype: float64
- name: context_emb_592
dtype: float64
- name: context_emb_593
dtype: float64
- name: context_emb_594
dtype: float64
- name: context_emb_595
dtype: float64
- name: context_emb_596
dtype: float64
- name: context_emb_597
dtype: float64
- name: context_emb_598
dtype: float64
- name: context_emb_599
dtype: float64
- name: context_emb_600
dtype: float64
- name: context_emb_601
dtype: float64
- name: context_emb_602
dtype: float64
- name: context_emb_603
dtype: float64
- name: context_emb_604
dtype: float64
- name: context_emb_605
dtype: float64
- name: context_emb_606
dtype: float64
- name: context_emb_607
dtype: float64
- name: context_emb_608
dtype: float64
- name: context_emb_609
dtype: float64
- name: context_emb_610
dtype: float64
- name: context_emb_611
dtype: float64
- name: context_emb_612
dtype: float64
- name: context_emb_613
dtype: float64
- name: context_emb_614
dtype: float64
- name: context_emb_615
dtype: float64
- name: context_emb_616
dtype: float64
- name: context_emb_617
dtype: float64
- name: context_emb_618
dtype: float64
- name: context_emb_619
dtype: float64
- name: context_emb_620
dtype: float64
- name: context_emb_621
dtype: float64
- name: context_emb_622
dtype: float64
- name: context_emb_623
dtype: float64
- name: context_emb_624
dtype: float64
- name: context_emb_625
dtype: float64
- name: context_emb_626
dtype: float64
- name: context_emb_627
dtype: float64
- name: context_emb_628
dtype: float64
- name: context_emb_629
dtype: float64
- name: context_emb_630
dtype: float64
- name: context_emb_631
dtype: float64
- name: context_emb_632
dtype: float64
- name: context_emb_633
dtype: float64
- name: context_emb_634
dtype: float64
- name: context_emb_635
dtype: float64
- name: context_emb_636
dtype: float64
- name: context_emb_637
dtype: float64
- name: context_emb_638
dtype: float64
- name: context_emb_639
dtype: float64
- name: context_emb_640
dtype: float64
- name: context_emb_641
dtype: float64
- name: context_emb_642
dtype: float64
- name: context_emb_643
dtype: float64
- name: context_emb_644
dtype: float64
- name: context_emb_645
dtype: float64
- name: context_emb_646
dtype: float64
- name: context_emb_647
dtype: float64
- name: context_emb_648
dtype: float64
- name: context_emb_649
dtype: float64
- name: context_emb_650
dtype: float64
- name: context_emb_651
dtype: float64
- name: context_emb_652
dtype: float64
- name: context_emb_653
dtype: float64
- name: context_emb_654
dtype: float64
- name: context_emb_655
dtype: float64
- name: context_emb_656
dtype: float64
- name: context_emb_657
dtype: float64
- name: context_emb_658
dtype: float64
- name: context_emb_659
dtype: float64
- name: context_emb_660
dtype: float64
- name: context_emb_661
dtype: float64
- name: context_emb_662
dtype: float64
- name: context_emb_663
dtype: float64
- name: context_emb_664
dtype: float64
- name: context_emb_665
dtype: float64
- name: context_emb_666
dtype: float64
- name: context_emb_667
dtype: float64
- name: context_emb_668
dtype: float64
- name: context_emb_669
dtype: float64
- name: context_emb_670
dtype: float64
- name: context_emb_671
dtype: float64
- name: context_emb_672
dtype: float64
- name: context_emb_673
dtype: float64
- name: context_emb_674
dtype: float64
- name: context_emb_675
dtype: float64
- name: context_emb_676
dtype: float64
- name: context_emb_677
dtype: float64
- name: context_emb_678
dtype: float64
- name: context_emb_679
dtype: float64
- name: context_emb_680
dtype: float64
- name: context_emb_681
dtype: float64
- name: context_emb_682
dtype: float64
- name: context_emb_683
dtype: float64
- name: context_emb_684
dtype: float64
- name: context_emb_685
dtype: float64
- name: context_emb_686
dtype: float64
- name: context_emb_687
dtype: float64
- name: context_emb_688
dtype: float64
- name: context_emb_689
dtype: float64
- name: context_emb_690
dtype: float64
- name: context_emb_691
dtype: float64
- name: context_emb_692
dtype: float64
- name: context_emb_693
dtype: float64
- name: context_emb_694
dtype: float64
- name: context_emb_695
dtype: float64
- name: context_emb_696
dtype: float64
- name: context_emb_697
dtype: float64
- name: context_emb_698
dtype: float64
- name: context_emb_699
dtype: float64
- name: context_emb_700
dtype: float64
- name: context_emb_701
dtype: float64
- name: context_emb_702
dtype: float64
- name: context_emb_703
dtype: float64
- name: context_emb_704
dtype: float64
- name: context_emb_705
dtype: float64
- name: context_emb_706
dtype: float64
- name: context_emb_707
dtype: float64
- name: context_emb_708
dtype: float64
- name: context_emb_709
dtype: float64
- name: context_emb_710
dtype: float64
- name: context_emb_711
dtype: float64
- name: context_emb_712
dtype: float64
- name: context_emb_713
dtype: float64
- name: context_emb_714
dtype: float64
- name: context_emb_715
dtype: float64
- name: context_emb_716
dtype: float64
- name: context_emb_717
dtype: float64
- name: context_emb_718
dtype: float64
- name: context_emb_719
dtype: float64
- name: context_emb_720
dtype: float64
- name: context_emb_721
dtype: float64
- name: context_emb_722
dtype: float64
- name: context_emb_723
dtype: float64
- name: context_emb_724
dtype: float64
- name: context_emb_725
dtype: float64
- name: context_emb_726
dtype: float64
- name: context_emb_727
dtype: float64
- name: context_emb_728
dtype: float64
- name: context_emb_729
dtype: float64
- name: context_emb_730
dtype: float64
- name: context_emb_731
dtype: float64
- name: context_emb_732
dtype: float64
- name: context_emb_733
dtype: float64
- name: context_emb_734
dtype: float64
- name: context_emb_735
dtype: float64
- name: context_emb_736
dtype: float64
- name: context_emb_737
dtype: float64
- name: context_emb_738
dtype: float64
- name: context_emb_739
dtype: float64
- name: context_emb_740
dtype: float64
- name: context_emb_741
dtype: float64
- name: context_emb_742
dtype: float64
- name: context_emb_743
dtype: float64
- name: context_emb_744
dtype: float64
- name: context_emb_745
dtype: float64
- name: context_emb_746
dtype: float64
- name: context_emb_747
dtype: float64
- name: context_emb_748
dtype: float64
- name: context_emb_749
dtype: float64
- name: context_emb_750
dtype: float64
- name: context_emb_751
dtype: float64
- name: context_emb_752
dtype: float64
- name: context_emb_753
dtype: float64
- name: context_emb_754
dtype: float64
- name: context_emb_755
dtype: float64
- name: context_emb_756
dtype: float64
- name: context_emb_757
dtype: float64
- name: context_emb_758
dtype: float64
- name: context_emb_759
dtype: float64
- name: context_emb_760
dtype: float64
- name: context_emb_761
dtype: float64
- name: context_emb_762
dtype: float64
- name: context_emb_763
dtype: float64
- name: context_emb_764
dtype: float64
- name: context_emb_765
dtype: float64
- name: context_emb_766
dtype: float64
- name: context_emb_767
dtype: float64
- name: bm25_score
dtype: float64
- name: cos_sim_score
dtype: float64
- name: dotp_sim_score
dtype: float64
- name: meta_bm25_score
dtype: float64
- name: meta_cos_sim_score
dtype: float64
- name: meta_dotp_sim_score
dtype: float64
- name: summarized_bm25_score
dtype: float64
- name: summarized_cos_sim_score
dtype: float64
- name: summarized_dotp_sim_score
dtype: float64
- name: label
dtype: float64
splits:
- name: train
num_bytes: 746334592
num_examples: 60344
download_size: 251628811
dataset_size: 746334592
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
trip2fun/autotrain-data-hstv-cc-help_v01 | ---
language:
- en
task_categories:
- text-classification
---
# AutoTrain Dataset for project: hstv-cc-help_v01
## Dataset Description
This dataset has been automatically processed by AutoTrain for project hstv-cc-help_v01.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "Product Name",
"feat_\u200b\u200bHuggimalz\u200b Unicorn Soft Plush Toy": null,
"target": 4,
"feat_\u00a329.99": null,
"feat_What products do you offer?": null,
"feat_We offer a wide range of products including the Power XL Vortex PRO - 4L Digital Air Fryer, Drew&Cole Adoro Pizza Oven, Nutribullet Smart Touch Blender Combo, SmartAir BOOST Radiator Fan, and many more.": null,
"feat_Ollyball \u2013 The Ultimate Indoor Play Ball": "Nutribullet 600 Series Starter Kit",
"feat_Now you can play ball in the house - Hit it, kick it, colour it in Ollyball is perfect for full-speed indoors without breaking windows or leaving a nasty bruise The 30cm super lightweight inflatable ball, with special KrunchCOR construction, absorbs the impact from full-speed hits and kicks.": null,
"feat_SAVE \u00a310": null,
"feat_As low as \u00a317.99": "\u00a359.99",
"feat_https://www.highstreettv.com/media/catalog/product/cache/f158af82292ec3d0638e111a17ec7f2d/o/l/ollyball_web_images_cd333_72dpi_02_3.jpg": null,
"feat_Happy Nappers - Disco Dolphin - Medium (ages 3 to 6)": null,
"feat_5.0 Stars-Reviews 2 ": null
},
{
"text": "Product Name",
"feat_\u200b\u200bHuggimalz\u200b Unicorn Soft Plush Toy": "Like New - Nutribullet 1200 Series",
"target": 1,
"feat_\u00a329.99": "\u00a3119.99",
"feat_What products do you offer?": null,
"feat_We offer a wide range of products including the Power XL Vortex PRO - 4L Digital Air Fryer, Drew&Cole Adoro Pizza Oven, Nutribullet Smart Touch Blender Combo, SmartAir BOOST Radiator Fan, and many more.": null,
"feat_Ollyball \u2013 The Ultimate Indoor Play Ball": null,
"feat_Now you can play ball in the house - Hit it, kick it, colour it in Ollyball is perfect for full-speed indoors without breaking windows or leaving a nasty bruise The 30cm super lightweight inflatable ball, with special KrunchCOR construction, absorbs the impact from full-speed hits and kicks.": null,
"feat_SAVE \u00a310": null,
"feat_As low as \u00a317.99": null,
"feat_https://www.highstreettv.com/media/catalog/product/cache/f158af82292ec3d0638e111a17ec7f2d/o/l/ollyball_web_images_cd333_72dpi_02_3.jpg": null,
"feat_Happy Nappers - Disco Dolphin - Medium (ages 3 to 6)": null,
"feat_5.0 Stars-Reviews 2 ": null
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"feat_\u200b\u200bHuggimalz\u200b Unicorn Soft Plush Toy": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=[' Stars-Reviews', 'Before Price', 'Description', 'Discount', 'Final Price', 'Product Photo', 'Response:'], id=None)",
"feat_\u00a329.99": "Value(dtype='string', id=None)",
"feat_What products do you offer?": "Value(dtype='string', id=None)",
"feat_We offer a wide range of products including the Power XL Vortex PRO - 4L Digital Air Fryer, Drew&Cole Adoro Pizza Oven, Nutribullet Smart Touch Blender Combo, SmartAir BOOST Radiator Fan, and many more.": "Value(dtype='string', id=None)",
"feat_Ollyball \u2013 The Ultimate Indoor Play Ball": "Value(dtype='string', id=None)",
"feat_Now you can play ball in the house - Hit it, kick it, colour it in Ollyball is perfect for full-speed indoors without breaking windows or leaving a nasty bruise The 30cm super lightweight inflatable ball, with special KrunchCOR construction, absorbs the impact from full-speed hits and kicks.": "Value(dtype='string', id=None)",
"feat_SAVE \u00a310": "Value(dtype='string', id=None)",
"feat_As low as \u00a317.99": "Value(dtype='string', id=None)",
"feat_https://www.highstreettv.com/media/catalog/product/cache/f158af82292ec3d0638e111a17ec7f2d/o/l/ollyball_web_images_cd333_72dpi_02_3.jpg": "Value(dtype='string', id=None)",
"feat_Happy Nappers - Disco Dolphin - Medium (ages 3 to 6)": "Value(dtype='string', id=None)",
"feat_5.0 Stars-Reviews 2 ": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 2786 |
| valid | 699 |
|
ihsanenginbal/earthquake_wavelets | ---
license: mit
---
We produced RGB images in which the X axis is time in seconds, the Y axis is the wave frequency, and the color intensity represents the energy content of the wave at that moment and frequency.
Each wavelet image represents a 120-second-long record.
The dataset contains wavelet images from earthquakes, from stormy days, from rush hours, and from quiet hours of the day.
To see exactly how the wavelet images are produced, check the code below, which we used to generate them:
```python
import pywt
import numpy as np
import matplotlib.pyplot as plt
from matplotlib import cm
from scipy.io import loadmat
from joblib import delayed, Parallel
from tqdm import tqdm


def plot(data, fig_index):
    # parameters for the CWT
    deltat: float = 0.005
    wavelet = 'morl'
    per1 = 1 / 20   # shortest period (s), i.e. maximum frequency of 20 Hz
    per2 = 1 / 0.1  # longest period (s), i.e. minimum frequency of 0.1 Hz
    cfreq = 0.8125  # default center frequency for the 'morl' wavelet
    scale1 = per1 * cfreq / deltat
    scale2 = per2 * cfreq / deltat
    scales = np.arange(scale1, scale2 + (scale2 - scale1) / 15, (scale2 - scale1) / 15)
    coefs, freqs = pywt.cwt(data, scales, wavelet)  # continuous wavelet transform
    coeflist = np.array(coefs)  # convert from tuple to ndarray
    S = np.sqrt(np.absolute(coeflist))
    SC = (100 * S) / np.sum(np.array(S).flatten())  # normalize to percentage energy
    F = pywt.scale2frequency(wavelet, scales) / deltat
    time = np.arange(deltat, (len(data) * deltat) + deltat, deltat)
    plt.contourf(time, 1 / F, SC, cmap=cm.jet)
    plt.axis('off')
    # note: the plot canvas size should be fixed for consistent output
    plt.savefig('Storm_Plots/contourf_' + str(fig_index) + '.png', dpi=500)


def process(record, index):
    plot(record.flatten(), fig_index=index + 1)


EQ = loadmat('Storm_Data_Katerina_All.mat')
records = EQ['Storm_Data_Katerina_All']['recData']
print(len(records[0]))
Parallel(n_jobs=10)(delayed(process)(record, index)
                    for index, record in enumerate(tqdm(records[0])))
```
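As a sanity check on the scale/frequency bookkeeping above, here is a minimal pure-Python sketch of the period-to-scale conversion used in `plot()` (the `0.8125` center frequency is pywt's default for the `'morl'` wavelet; the function names here are illustrative, not part of the original script):

```python
deltat = 0.005  # sampling interval in seconds (200 Hz)
cfreq = 0.8125  # Morlet wavelet center frequency

def period_to_scale(period):
    """Convert a target period (s) to a CWT scale, as in plot()."""
    return period * cfreq / deltat

def scale_to_frequency(scale):
    """Inverse mapping: frequency (Hz) recovered from a scale."""
    return cfreq / (scale * deltat)

print(period_to_scale(1 / 20))    # 20 Hz component  -> scale 8.125
print(period_to_scale(1 / 0.1))   # 0.1 Hz component -> scale 1625.0
print(scale_to_frequency(8.125))  # recovers 20.0 Hz
```

So the scale axis spans 8.125 to 1625, corresponding to the 0.1–20 Hz band plotted on the Y axis of each wavelet image.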
|
jahb57/gpt2_embeddings_BATCH_8 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: last_hidden_state
sequence:
sequence: float32
splits:
- name: train
num_bytes: 18869855146
num_examples: 100000
download_size: 18918220794
dataset_size: 18869855146
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
generordo/nailson | ---
license: openrail
---
|
myradeng/diffusion_db_5k_val_v3 | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: seed
dtype: uint32
- name: step
dtype: uint16
- name: cfg
dtype: float32
- name: sampler
dtype: string
- name: width
dtype: uint16
- name: height
dtype: uint16
- name: user_name
dtype: string
- name: timestamp
dtype: timestamp[us, tz=UTC]
- name: image_nsfw
dtype: float32
- name: prompt_nsfw
dtype: float32
splits:
- name: train
num_bytes: 458258338.8
num_examples: 1000
download_size: 458124179
dataset_size: 458258338.8
---
# Dataset Card for "diffusion_db_5k_val_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sai-Manisha/dataset-feb-6 | ---
license: mit
---
|
CVasNLPExperiments/OK-VQA_test_text_davinci_003_mode_T_A_D_PNP_NO_FILTER_C_Q_rices_ns_100 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 185064
num_examples: 100
download_size: 102042
dataset_size: 185064
---
# Dataset Card for "OK-VQA_test_text_davinci_003_mode_T_A_D_PNP_NO_FILTER_C_Q_rices_ns_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AakashShah7/ImageDataset | ---
license: apache-2.0
---
|
Exterus/Language | ---
license: other
---
|
vapecig/promptsai | ---
license: bsd
task_categories:
- text-generation
language:
- en
pretty_name: Awesome chatGPT prompts
size_categories:
- n<1K
---
Thanks and please support:
Ecigator is one of the well-known vape brands spun off from Giftsoar Technology Co., Ltd. It has been an ISO-certified [disposable vape manufacturer](https://ecigator.com/) for OEMs, ODMs, and OBMs since 2010.
[https://ecigator.com/](https://ecigator.com/) |
Back-up/chung-khoan-demo-p11 | ---
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
- name: view
struct:
- name: number_of_response
dtype: string
- name: number_of_view
dtype: string
- name: content
list:
- name: res
dtype: string
splits:
- name: train
num_bytes: 118339029
num_examples: 24040
download_size: 41554503
dataset_size: 118339029
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
JesusMaginge/modelo.de.entrenamiento | ---
license: openrail
---
|
gweg/boys | ---
pretty_name: 'Game boys genus male '
---
boys <3 |
BarraHome/ultrafeedback_binarized | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
splits:
- name: train_prefs
num_bytes: 405637300
num_examples: 61135
- name: train_sft
num_bytes: 405637300
num_examples: 61135
- name: test_prefs
num_bytes: 13176789
num_examples: 2000
- name: test_sft
num_bytes: 6701456
num_examples: 1000
- name: train_gen
num_bytes: 324989174
num_examples: 61135
- name: test_gen
num_bytes: 5341818
num_examples: 1000
download_size: 649878235
dataset_size: 1161483837
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
- split: train_sft
path: data/train_sft-*
- split: test_prefs
path: data/test_prefs-*
- split: test_sft
path: data/test_sft-*
- split: train_gen
path: data/train_gen-*
- split: test_gen
path: data/test_gen-*
license: mit
task_categories:
- conversational
language:
- en
size_categories:
- 100K<n<1M
--- |
mstz/kddcup | ---
language:
- en
tags:
- kddcup
- tabular_classification
- binary_classification
pretty_name: Kddcup
task_categories: # Full list at https://github.com/huggingface/hub-docs/blob/main/js/src/lib/interfaces/Types.ts
- tabular-classification
configs:
- kddcup
---
# Kddcup
The Kddcup dataset.
# Configurations and tasks
| **Configuration** | **Task** |
|-----------------------|---------------------------|
| kddcup | Multiclass classification.|
|
fvr2/dataset-test01 | ---
task_categories:
- text-generation
language:
- en
tags:
- music
--- |
Falah/men_fashion_prompts_SDXL | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 420748919
num_examples: 1000000
download_size: 57477342
dataset_size: 420748919
---
# Dataset Card for "men_fashion_prompts_SDXL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
karukas/mediasum-summary-matching | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
splits:
- name: train
num_bytes: 4149687650
num_examples: 443596
- name: validation
num_bytes: 92028438
num_examples: 10000
- name: test
num_bytes: 94033599
num_examples: 10000
download_size: 2438334598
dataset_size: 4335749687
---
# Dataset Card for "mediasum-summary-matching"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ajsmith/ala2 | ---
license: mit
---
|
mask-distilled-one-sec-cv12/chunk_119 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1367975984
num_examples: 268652
download_size: 1396440566
dataset_size: 1367975984
---
# Dataset Card for "chunk_119"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kheopss/prompt_dataset_p43_reformulated_2 | ---
dataset_info:
features:
- name: response
dtype: string
- name: rewriten
dtype: string
splits:
- name: train
num_bytes: 276556
num_examples: 100
download_size: 135171
dataset_size: 276556
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Retsadila/ritsu | ---
license: creativeml-openrail-m
---
This is a child voice dataset built from old singing samples.
|
Rodr16020/code_instructions_7_5k_alpaca_spanish | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: instruction_text
dtype: string
- name: llama2_chat_inst
dtype: string
splits:
- name: train
num_bytes: 15796815
num_examples: 7500
download_size: 7459672
dataset_size: 15796815
---
# Dataset Card for "code_instructions_7_5k_alpaca_spanish"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
VQA-CityU/IQA_data | ---
license: apache-2.0
---
|
fbaigt/schema-to-json | ---
license: gpl-3.0
configs:
- config_name: chemtables
data_files:
- split: train
path: chemtables/train-*
- split: validation
path: chemtables/validation-*
- split: test
path: chemtables/test-*
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
- config_name: discomat
data_files:
- split: train
path: discomat/train-*
- split: validation
path: discomat/validation-*
- split: test
path: discomat/test-*
- config_name: mltables
data_files:
- split: train
path: mltables/train-*
- split: validation
path: mltables/validation-*
- split: test
path: mltables/test-*
dataset_info:
- config_name: chemtables
features:
- name: paper_id
dtype: string
- name: table_id
dtype: string
- name: table_code
dtype: string
- name: sup_text
dtype: string
- name: target_cells
sequence:
- name: cell_value
dtype: string
- name: cell_raw
dtype: string
- name: cell_index
dtype: string
- name: cell_row_idx
dtype: int32
- name: cell_col_idx
dtype: int32
- name: gold_json_records
sequence:
- name: cell_index
dtype: string
- name: cell_record
dtype: string
splits:
- name: train
num_bytes: 92180
num_examples: 9
- name: validation
num_bytes: 39374
num_examples: 3
- name: test
num_bytes: 117148
num_examples: 14
download_size: 124818
dataset_size: 248702
- config_name: default
features:
- name: paper_id
dtype: string
- name: table_id
dtype: string
- name: table_code
dtype: string
- name: sup_text
dtype: string
- name: target_cells
sequence:
- name: cell_value
dtype: string
- name: cell_raw
dtype: string
- name: cell_index
dtype: string
- name: cell_row_idx
dtype: int32
- name: cell_col_idx
dtype: int32
- name: gold_json_records
sequence:
- name: cell_index
dtype: string
- name: cell_record
dtype: string
splits:
- name: train
num_bytes: 78484
num_examples: 9
- name: validation
num_bytes: 37457
num_examples: 3
- name: test
num_bytes: 113119
num_examples: 14
download_size: 122465
dataset_size: 229060
- config_name: discomat
features:
- name: paper_id
dtype: string
- name: table_id
dtype: string
- name: table_code
dtype: string
- name: sup_text
dtype: string
- name: target_cells
sequence:
- name: cell_value_processed
dtype: string
- name: i
dtype: int32
- name: j
dtype: int32
- name: k
dtype: int32
- name: gold_json_records
sequence:
- name: cell_index
sequence: int32
length: 3
- name: cell_record
dtype: string
splits:
- name: train
num_bytes: 2300237
num_examples: 500
- name: validation
num_bytes: 2300237
num_examples: 500
- name: test
num_bytes: 2366158
num_examples: 487
download_size: 1430344
dataset_size: 6966632
- config_name: mltables
features:
- name: paper_id
dtype: string
- name: table_id
dtype: string
- name: table_code
dtype: string
- name: sup_text
dtype: string
- name: target_cells
sequence:
- name: cell_value
dtype: string
- name: cell_raw
dtype: string
- name: cell_value_char_idx_start
dtype: int32
- name: cell_value_char_idx_end
dtype: int32
- name: cell_raw_char_idx_start
dtype: int32
- name: cell_raw_char_idx_end
dtype: int32
- name: gold_json_records
sequence:
- name: cell_char_index
sequence: int32
length: 2
- name: cell_record
dtype: string
splits:
- name: train
num_bytes: 696651
num_examples: 43
- name: validation
num_bytes: 150816
num_examples: 11
- name: test
num_bytes: 1248693
num_examples: 68
download_size: 605737
dataset_size: 2096160
---
|
msznajder/databricks-dolly-llama2-chat-15k | ---
dataset_info:
features:
- name: category
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 14946338
num_examples: 15011
download_size: 5006213
dataset_size: 14946338
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/ebisu_eika_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ebisu_eika (Touhou)
This is the dataset of ebisu_eika (Touhou), containing 132 images and their tags.
The core tags of this character are `bangs, long_hair, red_eyes, blonde_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 132 | 122.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ebisu_eika_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 132 | 81.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ebisu_eika_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 266 | 155.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ebisu_eika_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 132 | 112.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ebisu_eika_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 266 | 196.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ebisu_eika_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/ebisu_eika_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, barefoot, frilled_shirt, frilled_skirt, full_body, long_earlobes, looking_at_viewer, puffy_short_sleeves, skirt_set, solo, white_shirt, white_skirt, blouse, brown_eyes, rock, simple_background, sitting, stone, white_background, dark-skinned_female, open_mouth, toes, :d, blush_stickers, feet, medium_hair |
| 1 | 5 |  |  |  |  |  | 1girl, long_earlobes, open_mouth, puffy_short_sleeves, solo, white_shirt, frilled_shirt, looking_at_viewer, rock, stone, white_skirt, :d, blush, holding, jellyfish, skirt_set, upper_body |
| 2 | 5 |  |  |  |  |  | 1girl, long_earlobes, puffy_short_sleeves, solo, upper_body, dress, open_mouth, simple_background, white_shirt, looking_at_viewer, white_background, blush_stickers, brown_eyes, grey_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | barefoot | frilled_shirt | frilled_skirt | full_body | long_earlobes | looking_at_viewer | puffy_short_sleeves | skirt_set | solo | white_shirt | white_skirt | blouse | brown_eyes | rock | simple_background | sitting | stone | white_background | dark-skinned_female | open_mouth | toes | :d | blush_stickers | feet | medium_hair | blush | holding | jellyfish | upper_body | dress | grey_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:----------------|:----------------|:------------|:----------------|:--------------------|:----------------------|:------------|:-------|:--------------|:--------------|:---------|:-------------|:-------|:--------------------|:----------|:--------|:-------------------|:----------------------|:-------------|:-------|:-----|:-----------------|:-------|:--------------|:--------|:----------|:------------|:-------------|:--------|:------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | | | X | X | X | X | X | X | X | | | X | | | X | | | X | | X | | | | X | X | X | X | | |
| 2 | 5 |  |  |  |  |  | X | | | | | X | X | X | | X | X | | | X | | X | | | X | | X | | | X | | | | | | X | X | X |
|
dedoc/law_dataset | ---
license: mit
language:
- ru
size_categories:
- 10K<n<100K
---
Dataset for a line classifier of [Russian laws](https://dedoc.readthedocs.io/en/latest/structure_types/law.html) |
Danieldlima21/Bocoyoutuber | ---
license: openrail
---
|
Abdullah44ali/auditing | ---
license: apache-2.0
---
|
PlanTL-GOB-ES/WikiCAT_esv2 | ---
annotations_creators:
- automatically-generated
language_creators:
- found
language:
- es
license:
- cc-by-sa-3.0
multilinguality:
- monolingual
pretty_name: wikicat_esv2
size_categories:
- unknown
source_datasets: []
task_categories:
- text-classification
task_ids:
- multi-class-classification
---
# WikiCAT_es: Spanish Text Classification dataset
## Dataset Description
- **Paper:**
- **Point of Contact:** carlos.rodriguez1@bsc.es
**Repository**
### Dataset Summary
WikiCAT_es is a Spanish corpus for thematic text classification tasks. It is created automatically from Wikipedia and Wikidata sources, and contains 8401 articles from the Spanish Wikipedia classified under 12 different categories.
This dataset was developed by BSC TeMU as part of the PlanTL project, and is intended as an evaluation of LT capabilities to generate useful synthetic corpora.
### Supported Tasks and Leaderboards
Text classification, Language Model
### Languages
ES- Spanish
## Dataset Structure
### Data Instances
Two json files, one for each split.
### Data Fields
We used a simple model with the article text and associated labels, without further metadata.
#### Example:
<pre>
{'sentence': 'La economía de Reunión se ha basado tradicionalmente en la agricultura. La caña de azúcar ha sido el cultivo principal durante más de un siglo, y en algunos años representa el 85% de las exportaciones. El gobierno ha estado impulsando el desarrollo de una industria turística para aliviar el alto desempleo, que representa más del 40% de la fuerza laboral.(...) El PIB total de la isla fue de 18.800 millones de dólares EE.UU. en 2007.', 'label': 'Economía'}
</pre>
#### Labels
'Religión', 'Entretenimiento', 'Música', 'Ciencia_y_Tecnología', 'Política', 'Economía', 'Matemáticas', 'Humanidades', 'Deporte', 'Derecho', 'Historia', 'Filosofía'
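As a sketch of how these 12 labels might be encoded as integer ids before training a classifier: the label names below are taken from this card, but the ordering of the mapping is an assumption for illustration, not part of the dataset.

```python
# Hypothetical label encoding for WikiCAT_es. The label names come from the
# card above; the id assignment (enumeration order) is an illustrative choice.
labels = [
    'Religión', 'Entretenimiento', 'Música', 'Ciencia_y_Tecnología',
    'Política', 'Economía', 'Matemáticas', 'Humanidades',
    'Deporte', 'Derecho', 'Historia', 'Filosofía',
]
label2id = {name: i for i, name in enumerate(labels)}
id2label = {i: name for name, i in label2id.items()}

# Encode the string label of an instance (sentence truncated for brevity)
example = {'sentence': 'La economía de Reunión se ha basado...', 'label': 'Economía'}
encoded = {'sentence': example['sentence'], 'label': label2id[example['label']]}
```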
### Data Splits
* hfeval_esv5.json: 1681 label-document pairs
* hftrain_esv5.json: 6716 label-document pairs
## Dataset Creation
### Methodology
The "Category" pages represent the topics.
For each topic, we extract the pages associated with that first level of the hierarchy, and use the summary as the representative text.
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
The source data are thematic categories in the different Wikipedias
#### Who are the source language producers?
### Annotations
#### Annotation process
Automatic annotation
#### Who are the annotators?
[N/A]
### Personal and Sensitive Information
No personal or sensitive information included.
## Considerations for Using the Data
### Social Impact of Dataset
We hope this corpus contributes to the development of language models in Spanish.
### Discussion of Biases
We are aware that this data might contain biases. We have not applied any steps to reduce their impact.
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@bsc.es).
For further information, send an email to (plantl-gob-es@bsc.es).
This work was funded by the [Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA)](https://avancedigital.mineco.gob.es/en-us/Paginas/index.aspx) within the framework of the [Plan-TL](https://plantl.mineco.gob.es/Paginas/index.aspx).
### Licensing Information
This work is licensed under [CC Attribution 4.0 International](https://creativecommons.org/licenses/by/4.0/) License.
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Contributions
[N/A]
|
varunr14/text2prompt | ---
license: unknown
---
|
GEM-submissions/Simon1997__bart-base_original_cacapo__1678442415 | ---
benchmark: gem
type: prediction
submission_name: BART-base_Original_CACAPO
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: BART-base_Original_CACAPO
|
ContractorQB/aimitz | ---
license: other
---
|
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_v5-mathemak-b6a817-2053667121 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test_v5
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-2.7b_eval
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test_v5
dataset_config: mathemakitten--winobias_antistereotype_test_v5
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-2.7b_eval
* Dataset: mathemakitten/winobias_antistereotype_test_v5
* Config: mathemakitten--winobias_antistereotype_test_v5
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
israfelsr/mm_tiny_imagenet | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': n01443537
'1': n01629819
'2': n01641577
'3': n01644900
'4': n01698640
'5': n01742172
'6': n01768244
'7': n01770393
'8': n01774384
'9': n01774750
'10': n01784675
'11': n01882714
'12': n01910747
'13': n01917289
'14': n01944390
'15': n01950731
'16': n01983481
'17': n01984695
'18': n02002724
'19': n02056570
'20': n02058221
'21': n02074367
'22': n02094433
'23': n02099601
'24': n02099712
'25': n02106662
'26': n02113799
'27': n02123045
'28': n02123394
'29': n02124075
'30': n02125311
'31': n02129165
'32': n02132136
'33': n02165456
'34': n02226429
'35': n02231487
'36': n02233338
'37': n02236044
'38': n02268443
'39': n02279972
'40': n02281406
'41': n02321529
'42': n02364673
'43': n02395406
'44': n02403003
'45': n02410509
'46': n02415577
'47': n02423022
'48': n02437312
'49': n02480495
'50': n02481823
'51': n02486410
'52': n02504458
'53': n02509815
'54': n02666347
'55': n02669723
'56': n02699494
'57': n02769748
'58': n02788148
'59': n02791270
'60': n02793495
'61': n02795169
'62': n02802426
'63': n02808440
'64': n02814533
'65': n02814860
'66': n02815834
'67': n02823428
'68': n02837789
'69': n02841315
'70': n02843684
'71': n02883205
'72': n02892201
'73': n02909870
'74': n02917067
'75': n02927161
'76': n02948072
'77': n02950826
'78': n02963159
'79': n02977058
'80': n02988304
'81': n03014705
'82': n03026506
'83': n03042490
'84': n03085013
'85': n03089624
'86': n03100240
'87': n03126707
'88': n03160309
'89': n03179701
'90': n03201208
'91': n03255030
'92': n03355925
'93': n03373237
'94': n03388043
'95': n03393912
'96': n03400231
'97': n03404251
'98': n03424325
'99': n03444034
'100': n03447447
'101': n03544143
'102': n03584254
'103': n03599486
'104': n03617480
'105': n03637318
'106': n03649909
'107': n03662601
'108': n03670208
'109': n03706229
'110': n03733131
'111': n03763968
'112': n03770439
'113': n03796401
'114': n03814639
'115': n03837869
'116': n03838899
'117': n03854065
'118': n03891332
'119': n03902125
'120': n03930313
'121': n03937543
'122': n03970156
'123': n03977966
'124': n03980874
'125': n03983396
'126': n03992509
'127': n04008634
'128': n04023962
'129': n04070727
'130': n04074963
'131': n04099969
'132': n04118538
'133': n04133789
'134': n04146614
'135': n04149813
'136': n04179913
'137': n04251144
'138': n04254777
'139': n04259630
'140': n04265275
'141': n04275548
'142': n04285008
'143': n04311004
'144': n04328186
'145': n04356056
'146': n04366367
'147': n04371430
'148': n04376876
'149': n04398044
'150': n04399382
'151': n04417672
'152': n04456115
'153': n04465666
'154': n04486054
'155': n04487081
'156': n04501370
'157': n04507155
'158': n04532106
'159': n04532670
'160': n04540053
'161': n04560804
'162': n04562935
'163': n04596742
'164': n04598010
'165': n06596364
'166': n07056680
'167': n07583066
'168': n07614500
'169': n07615774
'170': n07646821
'171': n07647870
'172': n07657664
'173': n07695742
'174': n07711569
'175': n07715103
'176': n07720875
'177': n07749582
'178': n07753592
'179': n07768694
'180': n07871810
'181': n07873807
'182': n07875152
'183': n07920052
'184': n07975909
'185': n08496334
'186': n08620881
'187': n08742578
'188': n09193705
'189': n09246464
'190': n09256479
'191': n09332890
'192': n09428293
'193': n12267677
'194': n12520864
'195': n13001041
'196': n13652335
'197': n13652994
'198': n13719102
'199': n14991210
- name: caption
dtype: string
- name: label_name
dtype: string
splits:
- name: train
num_bytes: 159978960.0
num_examples: 80000
- name: validation
num_bytes: 40004701.0
num_examples: 20000
download_size: 149059401
dataset_size: 199983661.0
---
# Dataset Card for "mm_tiny_imagenet"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
macavaney/d2q-msmarco-passage | ---
annotations_creators:
- no-annotation
language: []
language_creators:
- machine-generated
license: []
pretty_name: Doc2Query Generated Queries for `msmarco-passage`
source_datasets: [msmarco-passage]
tags:
- document-expansion
- doc2query
task_categories:
- text-retrieval
task_ids:
- document-retrieval
viewer: false
---
# Doc2Query Generated Queries for `msmarco-passage`
This dataset provides the pre-computed generated queries for the [`msmarco-passage`](https://ir-datasets.com/msmarco-passage) dataset,
for use when indexing with Doc2Query.
The generated queries come from the T5 Doc2Query model, released by the original authors [here](https://github.com/castorini/docTTTTTquery).
## Getting started
This artefact is meant to be used with the [`pyterrier_doc2query`](https://github.com/terrierteam/pyterrier_doc2query) package. It can
be installed as:
```bash
pip install git+https://github.com/terrierteam/pyterrier_doc2query
```
Depending on what you are using this artefact for, you may also need the following additional package:
```bash
pip install git+https://github.com/terrierteam/pyterrier_pisa # for indexing / retrieval
```
## Using this artefact
The main use case is to use this artefact in a Doc2Query indexing pipeline:
```python
import pyterrier as pt ; pt.init()
from pyterrier_pisa import PisaIndex
from pyterrier_doc2query import Doc2QueryStore
store = Doc2QueryStore.from_repo('https://huggingface.co/datasets/macavaney/d2q-msmarco-passage')
index = PisaIndex('path/to/index')
pipeline = store.generator(limit_k=40) >> index
dataset = pt.get_dataset('irds:msmarco-passage')
pipeline.index(dataset.get_corpus_iter())
```
You can also use the store directly as a dataset to look up or iterate over the data:
```python
store.lookup('100')
# {'querygen': ...}
for record in store:
    pass
```
## Reproducing this artefact
Due to the random nature of the Doc2Query generation process, this artefact cannot be reproduced verbatim through inference.
The following pipeline runs Doc2Query inference over the MS MARCO dataset. It will not produce the artefact verbatim,
but should produce similar results when used for indexing/retrieval.
```python
import pyterrier as pt ; pt.init()
from pyterrier_doc2query import Doc2Query, Doc2QueryStore
doc2query = Doc2Query('macavaney/doc2query-t5-base-msmarco', num_samples=80)
store = Doc2QueryStore('path/to/store')
pipeline = doc2query >> store
dataset = pt.get_dataset('irds:msmarco-passage')
pipeline.index(dataset.get_corpus_iter())
```
Note that this process will take quite some time, since it generates 80 queries for every document in the dataset.
Alternatively, you could reproduce this artefact verbatim using the following script, but it doesn't perform
model inference; it just uses the pre-generated queries from the original authors.
```bash
wget https://git.uwaterloo.ca/jimmylin/doc2query-data/raw/master/T5-passage/predicted_queries_topk_sampling.zip
unzip predicted_queries_topk_sampling.zip
```
```python
from pyterrier_doc2query import Doc2QueryStore
import os
import ir_datasets

def iter_files(path):
    # iterate over the numbered shard files of one sample run
    i = 0
    while os.path.exists(path.format(i)):
        with open(path.format(i), 'rt') as fin:
            for line in fin:
                yield line.strip()
        i += 1

def it():
    # zip the 80 sample runs together so each document gets all its queries
    file_iters = [iter_files('predicted_queries_topk_sample{:03}'.format(i)+'.txt{:03}-1004000') for i in range(80)]
    for i, queries in enumerate(zip(*file_iters)):
        yield {'docno': str(i), 'querygen': '\n'.join(queries)}

store = Doc2QueryStore('path/to/store')
store.index(it())
```
|
faisaltareque/multilingual-news-prompt | ---
dataset_info:
features:
- name: id
dtype: string
- name: headline
dtype: string
- name: article
dtype: string
- name: lang
dtype: string
- name: image_caption_separated
dtype: string
- name: topic_word_separated
dtype: string
- name: image_based_top_3
dtype: string
- name: caption_based_top_3
dtype: string
- name: image_based_top_5
dtype: string
- name: caption_based_top_5
dtype: string
- name: image_based_top_10
dtype: string
- name: caption_based_top_10
dtype: string
- name: image_based_top_15
dtype: string
- name: caption_based_top_15
dtype: string
- name: topic_word_separated_new
dtype: string
- name: topic_word_count_new
dtype: int64
- name: prompt_type
dtype: string
- name: article_prompt
dtype: string
splits:
- name: train
num_bytes: 9136949083
num_examples: 394353
- name: valid
num_bytes: 121366337
num_examples: 5187
- name: test
num_bytes: 358666498
num_examples: 15577
download_size: 5317632829
dataset_size: 9616981918
---
# Dataset Card for "multilingual-news-prompt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
daviddudas/invoices_v2 | ---
license: unknown
---
|
GEM/wiki_auto_asset_turk | ---
annotations_creators:
- crowd-sourced
language_creators:
- unknown
language:
- en
license:
- other
multilinguality:
- unknown
size_categories:
- unknown
source_datasets:
- original
task_categories:
- text2text-generation
task_ids:
- text-simplification
pretty_name: wiki_auto_asset_turk
---
# Dataset Card for GEM/wiki_auto_asset_turk
## Dataset Description
- **Homepage:** n/a
- **Repository:** [WikiAuto repository](https://github.com/chaojiang06/wiki-auto), [ASSET repository](https://github.com/facebookresearch/asset)
- **Paper:** [WikiAuto](https://aclanthology.org/2020.acl-main.709/), [ASSET](https://aclanthology.org/2020.acl-main.424/), [TURK](https://aclanthology.org/Q16-1029/)
- **Leaderboard:** N/A
- **Point of Contact:** WikiAuto: Chao Jiang; ASSET: Fernando Alva-Manchego and Louis Martin; TURK: Wei Xu
### Link to Main Data Card
You can find the main data card on the [GEM Website](https://gem-benchmark.com/data_cards/wiki_auto_asset_turk).
### Dataset Summary
WikiAuto is an English simplification dataset that we paired with ASSET and TURK, two very high-quality evaluation datasets, as test sets. The input is an English sentence taken from Wikipedia and the target a simplified sentence. ASSET and TURK contain the same test examples but have references that are simplified in different ways (splitting sentences vs. rewriting and splitting).
You can load the dataset via:
```
import datasets
data = datasets.load_dataset('GEM/wiki_auto_asset_turk')
```
The data loader can be found [here](https://huggingface.co/datasets/GEM/wiki_auto_asset_turk).
#### website
n/a
#### paper
[WikiAuto](https://aclanthology.org/2020.acl-main.709/), [ASSET](https://aclanthology.org/2020.acl-main.424/), [TURK](https://aclanthology.org/Q16-1029/)
#### authors
WikiAuto: Chao Jiang, Mounica Maddela, Wuwei Lan, Yang Zhong, Wei Xu; ASSET: Fernando Alva-Manchego, Louis Martin, Antoine Bordes, Carolina Scarton, Benoît Sagot, and Lucia Specia; TURK: Wei Xu, Courtney Napoles, Ellie Pavlick, Quanze Chen, and Chris Callison-Burch
## Dataset Overview
### Where to find the Data and its Documentation
#### Download
<!-- info: What is the link to where the original dataset is hosted? -->
<!-- scope: telescope -->
[Wiki-Auto repository](https://github.com/chaojiang06/wiki-auto), [ASSET repository](https://github.com/facebookresearch/asset), [TURKCorpus](https://github.com/cocoxu/simplification)
#### Paper
<!-- info: What is the link to the paper describing the dataset (open access preferred)? -->
<!-- scope: telescope -->
[WikiAuto](https://aclanthology.org/2020.acl-main.709/), [ASSET](https://aclanthology.org/2020.acl-main.424/), [TURK](https://aclanthology.org/Q16-1029/)
#### BibTex
<!-- info: Provide the BibTex-formatted reference for the dataset. Please use the correct published version (ACL anthology, etc.) instead of google scholar created Bibtex. -->
<!-- scope: microscope -->
WikiAuto:
```
@inproceedings{jiang-etal-2020-neural,
title = "Neural {CRF} Model for Sentence Alignment in Text Simplification",
author = "Jiang, Chao and
Maddela, Mounica and
Lan, Wuwei and
Zhong, Yang and
Xu, Wei",
booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.acl-main.709",
doi = "10.18653/v1/2020.acl-main.709",
pages = "7943--7960",
}
```
ASSET:
```
@inproceedings{alva-manchego-etal-2020-asset,
title = "{ASSET}: {A} Dataset for Tuning and Evaluation of Sentence Simplification Models with Multiple Rewriting Transformations",
author = "Alva-Manchego, Fernando and
Martin, Louis and
Bordes, Antoine and
Scarton, Carolina and
Sagot, Beno{\^\i}t and
Specia, Lucia",
booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.acl-main.424",
pages = "4668--4679",
}
```
TURK:
```
@article{Xu-EtAl:2016:TACL,
author = {Wei Xu and Courtney Napoles and Ellie Pavlick and Quanze Chen and Chris Callison-Burch},
title = {Optimizing Statistical Machine Translation for Text Simplification},
journal = {Transactions of the Association for Computational Linguistics},
volume = {4},
year = {2016},
url = {https://cocoxu.github.io/publications/tacl2016-smt-simplification.pdf},
pages = {401--415}
}
```
#### Contact Name
<!-- quick -->
<!-- info: If known, provide the name of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
WikiAuto: Chao Jiang; ASSET: Fernando Alva-Manchego and Louis Martin; TURK: Wei Xu
#### Contact Email
<!-- info: If known, provide the email of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
jiang.1530@osu.edu, f.alva@sheffield.ac.uk, louismartincs@gmail.com, wei.xu@cc.gatech.edu
#### Has a Leaderboard?
<!-- info: Does the dataset have an active leaderboard? -->
<!-- scope: telescope -->
no
### Languages and Intended Use
#### Multilingual?
<!-- quick -->
<!-- info: Is the dataset multilingual? -->
<!-- scope: telescope -->
no
#### Covered Languages
<!-- quick -->
<!-- info: What languages/dialects are covered in the dataset? -->
<!-- scope: telescope -->
`English`
#### Whose Language?
<!-- info: Whose language is in the dataset? -->
<!-- scope: periscope -->
Wiki-Auto contains English text only (BCP-47: `en`). It is presented as a translation task where Wikipedia Simple English is treated as its own idiom. For a statement of what is intended (but not always observed) to constitute Simple English on this platform, see [Simple English in Wikipedia](https://simple.wikipedia.org/wiki/Wikipedia:About#Simple_English).
Both ASSET and TURK use crowdsourcing to change references, and their language is thus a combination of the WikiAuto data and the language of the worker demographic on Mechanical Turk.
#### License
<!-- quick -->
<!-- info: What is the license of the dataset? -->
<!-- scope: telescope -->
other: Other license
#### Intended Use
<!-- info: What is the intended use of the dataset? -->
<!-- scope: microscope -->
WikiAuto provides a set of aligned sentences from English Wikipedia and Simple English Wikipedia as a resource to train sentence simplification systems.
The authors first crowd-sourced a set of manual alignments between sentences in a subset of the Simple English Wikipedia and their corresponding versions in English Wikipedia (this corresponds to the `manual` config in this version of the dataset), then trained a neural CRF system to predict these alignments.
The trained alignment prediction model was then applied to the other articles in Simple English Wikipedia with an English counterpart to create a larger corpus of aligned sentences (corresponding to the `auto` and `auto_acl` configs here).
[ASSET](https://github.com/facebookresearch/asset) [(Alva-Manchego et al., 2020)](https://www.aclweb.org/anthology/2020.acl-main.424.pdf) is multi-reference dataset for the evaluation of sentence simplification in English. The dataset uses the same 2,359 sentences from [TurkCorpus](https://github.com/cocoxu/simplification/) [(Xu et al., 2016)](https://www.aclweb.org/anthology/Q16-1029.pdf) and each sentence is associated with 10 crowdsourced simplifications. Unlike previous simplification datasets, which contain a single transformation (e.g., lexical paraphrasing in TurkCorpus or sentence
splitting in [HSplit](https://www.aclweb.org/anthology/D18-1081.pdf)), the simplifications in ASSET encompass a variety of rewriting transformations.
TURKCorpus is a high quality simplification dataset where each source (not simple) sentence is associated with 8 human-written simplifications that focus on lexical paraphrasing. It is one of the two evaluation datasets for the text simplification task in GEM. It acts as the validation and test set for paraphrasing-based simplification that does not involve sentence splitting and deletion.
#### Add. License Info
<!-- info: What is the 'other' license of the dataset? -->
<!-- scope: periscope -->
WikiAuto: `CC BY-NC 3.0`, ASSET: `CC BY-NC 4.0`, TURK: `GNU General Public License v3.0`
#### Primary Task
<!-- info: What primary task does the dataset support? -->
<!-- scope: telescope -->
Simplification
#### Communicative Goal
<!-- quick -->
<!-- info: Provide a short description of the communicative goal of a model trained for this task on this dataset. -->
<!-- scope: periscope -->
The goal is to communicate the main ideas of source sentence in a way that is easier to understand by non-native speakers of English.
### Credit
#### Curation Organization Type(s)
<!-- info: In what kind of organization did the dataset curation happen? -->
<!-- scope: telescope -->
`academic`, `industry`
#### Curation Organization(s)
<!-- info: Name the organization(s). -->
<!-- scope: periscope -->
Ohio State University, University of Sheffield, Inria, Facebook AI Research, Imperial College London, University of Pennsylvania, John Hopkins University
#### Dataset Creators
<!-- info: Who created the original dataset? List the people involved in collecting the dataset and their affiliation(s). -->
<!-- scope: microscope -->
WikiAuto: Chao Jiang, Mounica Maddela, Wuwei Lan, Yang Zhong, Wei Xu; ASSET: Fernando Alva-Manchego, Louis Martin, Antoine Bordes, Carolina Scarton, Benoît Sagot, and Lucia Specia; TURK: Wei Xu, Courtney Napoles, Ellie Pavlick, Quanze Chen, and Chris Callison-Burch
#### Funding
<!-- info: Who funded the data creation? -->
<!-- scope: microscope -->
WikiAuto: NSF, ODNI, IARPA, Figure Eight AI, and Criteo. ASSET: PRAIRIE Institute, ANR. TURK: NSF
#### Who added the Dataset to GEM?
<!-- info: Who contributed to the data card and adding the dataset to GEM? List the people+affiliations involved in creating this data card and who helped integrate this dataset into GEM. -->
<!-- scope: microscope -->
GEM v1 had separate data cards for WikiAuto, ASSET, and TURK. They were contributed by Dhruv Kumar and Mounica Maddela. The initial data loader was written by Yacine Jernite. Sebastian Gehrmann merged and extended the data cards and migrated the loader to the v2 infrastructure.
### Dataset Structure
#### Data Fields
<!-- info: List and describe the fields present in the dataset. -->
<!-- scope: telescope -->
- `source`: A source sentence from one of the datasets
- `target`: A single simplified sentence corresponding to `source`
- `references`: In the case of ASSET/TURK, references is a list of strings corresponding to the different references.
#### Reason for Structure
<!-- info: How was the dataset structure determined? -->
<!-- scope: microscope -->
The underlying datasets have extensive secondary annotations that can be used in conjunction with the GEM version. We omit those annotations to simplify the format into one that can be used by seq2seq models.
#### Example Instance
<!-- info: Provide a JSON formatted example of a typical instance in the dataset. -->
<!-- scope: periscope -->
```
{
'source': 'In early work, Rutherford discovered the concept of radioactive half-life , the radioactive element radon, and differentiated and named alpha and beta radiation .',
'target': 'Rutherford discovered the radioactive half-life, and the three parts of radiation which he named Alpha, Beta, and Gamma.'
}
```
#### Data Splits
<!-- info: Describe and name the splits in the dataset if there are more than one. -->
<!-- scope: periscope -->
In WikiAuto, which is used as training and validation set, the following splits are provided:
| | Train | Dev | Test |
| ----- | ------ | ----- | ---- |
| Total sentence pairs | 373801 | 73249 | 118074 |
| Aligned sentence pairs | 1889 | 346 | 677 |
ASSET does not contain a training set; many models use [WikiLarge](https://github.com/XingxingZhang/dress) (Zhang and Lapata, 2017) for training. For GEM, [Wiki-Auto](https://github.com/chaojiang06/wiki-auto) will be used for training the model.
Each input sentence has 10 associated reference simplified sentences. The statistics of ASSET are given below.
| | Dev | Test | Total |
| ----- | ------ | ---- | ----- |
| Input Sentences | 2000 | 359 | 2359 |
| Reference Simplifications | 20000 | 3590 | 23590 |
The test and validation sets are the same as those of [TurkCorpus](https://github.com/cocoxu/simplification/). The split was random.
There are 19.04 tokens per reference on average (lower than 21.29 and 25.49 for TurkCorpus and HSplit, respectively). Most (17,245) of the reference sentences do not involve sentence splitting.
TURKCorpus does not contain a training set; many models use [WikiLarge](https://github.com/XingxingZhang/dress) (Zhang and Lapata, 2017) or [Wiki-Auto](https://github.com/chaojiang06/wiki-auto) (Jiang et al., 2020) for training.
Each input sentence has 8 associated reference simplified sentences. 2,359 input sentences are randomly split into 2,000 validation and 359 test sentences.
| | Dev | Test | Total |
| ----- | ------ | ---- | ----- |
| Input Sentences | 2000 | 359 | 2359 |
| Reference Simplifications | 16000 | 2872 | 18872 |
There are 21.29 tokens per reference on average.
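The split statistics above follow directly from the per-sentence reference counts: ASSET pairs each input with 10 crowdsourced simplifications and TURK with 8, over the same 2000 dev + 359 test inputs. A quick sketch verifying this arithmetic:

```python
# Sanity-check the ASSET/TURK split tables: total references should equal
# total input sentences times references per input.
stats = {
    'ASSET': {'refs_per_input': 10, 'dev': 2000, 'test': 359},
    'TURK': {'refs_per_input': 8, 'dev': 2000, 'test': 359},
}
totals = {}
for name, s in stats.items():
    inputs = s['dev'] + s['test']
    totals[name] = {'inputs': inputs, 'references': inputs * s['refs_per_input']}
print(totals)
# ASSET: 2359 inputs, 23590 references; TURK: 2359 inputs, 18872 references
```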
#### Splitting Criteria
<!-- info: Describe any criteria for splitting the data, if used. If there are differences between the splits (e.g., if the training annotations are machine-generated and the dev and test ones are created by humans, or if different numbers of annotators contributed to each example), describe them here. -->
<!-- scope: microscope -->
In our setup, we use WikiAuto as training/validation corpus and ASSET and TURK as test corpora. ASSET and TURK have the same inputs but differ in their reference style. Researchers can thus conduct targeted evaluations based on the strategies that a model should learn.
## Dataset in GEM
### Rationale for Inclusion in GEM
#### Why is the Dataset in GEM?
<!-- info: What does this dataset contribute toward better generation evaluation and why is it part of GEM? -->
<!-- scope: microscope -->
WikiAuto is the largest open text simplification dataset currently available. ASSET and TURK are high quality test sets that are compatible with WikiAuto.
#### Similar Datasets
<!-- info: Do other datasets for the high level task exist? -->
<!-- scope: telescope -->
yes
#### Unique Language Coverage
<!-- info: Does this dataset cover other languages than other datasets for the same task? -->
<!-- scope: periscope -->
no
#### Difference from other GEM datasets
<!-- info: What else sets this dataset apart from other similar datasets in GEM? -->
<!-- scope: microscope -->
Its unique setup with multiple test sets makes the task interesting, since it allows for evaluation of multiple generations and of systems that simplify in different ways.
#### Ability that the Dataset measures
<!-- info: What aspect of model ability can be measured with this dataset? -->
<!-- scope: periscope -->
simplification
### GEM-Specific Curation
#### Modified for GEM?
<!-- info: Has the GEM version of the dataset been modified in any way (data, processing, splits) from the original curated data? -->
<!-- scope: telescope -->
yes
#### GEM Modifications
<!-- info: What changes have been made to the original dataset? -->
<!-- scope: periscope -->
`other`
#### Modification Details
<!-- info: For each of these changes, described them in more details and provided the intended purpose of the modification -->
<!-- scope: microscope -->
We removed secondary annotations and focus on the simple `input->output` format, but combine the different sub-datasets.
#### Additional Splits?
<!-- info: Does GEM provide additional splits to the dataset? -->
<!-- scope: telescope -->
yes
#### Split Information
<!-- info: Describe how the new splits were created -->
<!-- scope: periscope -->
We split the original test set according to the syntactic complexity of the source sentences. To characterize sentence syntactic complexity, we use the 8-level developmental level (d-level) scale proposed by [Covington et al. (2006)](https://www.researchgate.net/publication/254033869_How_complex_is_that_sentence_A_proposed_revision_of_the_Rosenberg_and_Abbeduto_D-Level_Scale) and the implementation of [Lu, Xiaofei (2010)](https://www.jbe-platform.com/content/journals/10.1075/ijcl.15.4.02lu).
We thus split the original test set into 8 subsets corresponding to the 8 d-levels assigned to source sentences. We obtain the following number of instances per level and average d-level of the dataset:
| Total nb. sentences | L0 | L1 | L2 | L3 | L4 | L5 | L6 | L7 | Mean Level |
|-------------------- | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ---------- |
| 359 | 166 | 0 | 58 | 32 | 5 | 28 | 7 | 63 | 2.38 |
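As a quick sanity check, the reported mean d-level can be reproduced from the per-level counts above:

```python
# Per-level sentence counts from the table above (levels L0-L7).
counts = {0: 166, 1: 0, 2: 58, 3: 32, 4: 5, 5: 28, 6: 7, 7: 63}
total = sum(counts.values())
mean_level = sum(level * n for level, n in counts.items()) / total
print(total, round(mean_level, 2))  # 359 2.38
```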
#### Split Motivation
<!-- info: What aspects of the model's generation capacities were the splits created to test? -->
<!-- scope: periscope -->
The goal was to assess performance when simplifying source sentences with different syntactic structure and complexity.
### Getting Started with the Task
#### Pointers to Resources
<!-- info: Getting started with in-depth research on the task. Add relevant pointers to resources that researchers can consult when they want to get started digging deeper into the task. -->
<!-- scope: microscope -->
There are recent supervised ([Martin et al., 2019](https://arxiv.org/abs/1910.02677), [Kriz et al., 2019](https://www.aclweb.org/anthology/N19-1317/), [Dong et al., 2019](https://www.aclweb.org/anthology/P19-1331/), [Zhang and Lapata, 2017](https://www.aclweb.org/anthology/D17-1062/)) and unsupervised ([Martin et al., 2020](https://arxiv.org/abs/2005.00352v1), [Kumar et al., 2020](https://www.aclweb.org/anthology/2020.acl-main.707/), [Surya et al., 2019](https://www.aclweb.org/anthology/P19-1198/)) text simplification models that can be used as baselines.
#### Technical Terms
<!-- info: Technical terms used in this card and the dataset and their definitions -->
<!-- scope: microscope -->
The common metric used for automatic evaluation is SARI [(Xu et al., 2016)](https://www.aclweb.org/anthology/Q16-1029/).
## Previous Results
### Previous Results
#### Measured Model Abilities
<!-- info: What aspect of model ability can be measured with this dataset? -->
<!-- scope: telescope -->
Simplification
#### Metrics
<!-- info: What metrics are typically used for this task? -->
<!-- scope: periscope -->
`Other: Other Metrics`, `BLEU`
#### Other Metrics
<!-- info: Definitions of other metrics -->
<!-- scope: periscope -->
SARI: A simplification metric that considers both input and references to measure the "goodness" of words that are added, deleted, and kept.
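As a rough illustration of what SARI rewards, here is a simplified, unigram-set sketch. It is not the official implementation (real SARI operates on n-grams up to order 4 with weighted reference counts), so use the original authors' code or an established toolkit such as EASSE for actual evaluations:

```python
def simple_sari(source, prediction, references):
    """Simplified, unigram-set sketch of SARI; for illustration only."""
    src = set(source.lower().split())
    out = set(prediction.lower().split())
    refs = [set(r.lower().split()) for r in references]

    def f1(p, r):
        return 2 * p * r / (p + r) if p + r else 0.0

    # ADD: words in the output but not the source, credited if some reference adds them.
    added, ref_added = out - src, set().union(*(r - src for r in refs))
    add_p = len(added & ref_added) / len(added) if added else 1.0
    add_r = len(added & ref_added) / len(ref_added) if ref_added else 1.0

    # KEEP: source words kept in the output, credited if references keep them too.
    kept, ref_kept = out & src, set().union(*(r & src for r in refs))
    keep_p = len(kept & ref_kept) / len(kept) if kept else 1.0
    keep_r = len(kept & ref_kept) / len(ref_kept) if ref_kept else 1.0

    # DELETE: source words removed, credited if every reference removed them
    # (SARI uses precision only for the delete operation).
    deleted, ref_deleted = src - out, src - set().union(*refs)
    del_p = len(deleted & ref_deleted) / len(deleted) if deleted else 1.0

    return (f1(add_p, add_r) + f1(keep_p, keep_r) + del_p) / 3

# A perfect match against a single reference scores 1.0.
assert simple_sari("he lived in london", "he lived in london", ["he lived in london"]) == 1.0
```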
#### Proposed Evaluation
<!-- info: List and describe the purpose of the metrics and evaluation methodology (including human evaluation) that the dataset creators used when introducing this task. -->
<!-- scope: microscope -->
The original authors of WikiAuto and ASSET used human evaluation to assess the fluency, adequacy, and simplicity (details provided in the paper). For TURK, the authors measured grammaticality, meaning-preservation, and simplicity gain (details in the paper).
#### Previous results available?
<!-- info: Are previous results available? -->
<!-- scope: telescope -->
no
## Dataset Curation
### Original Curation
#### Original Curation Rationale
<!-- info: Original curation rationale -->
<!-- scope: telescope -->
Wiki-Auto provides a new version of the Wikipedia corpus that is larger, contains 75% fewer defective pairs, and has more complex rewrites than the previous WIKILARGE dataset.
ASSET was created in order to improve the evaluation of sentence simplification. It uses the same input sentences as the [TurkCorpus](https://github.com/cocoxu/simplification/) dataset from [(Xu et al., 2016)](https://www.aclweb.org/anthology/Q16-1029.pdf). The 2,359 input sentences of TurkCorpus are a sample of "standard" (not simple) sentences from the [Parallel Wikipedia Simplification (PWKP)](https://www.informatik.tu-darmstadt.de/ukp/research_6/data/sentence_simplification/simple_complex_sentence_pairs/index.en.jsp) dataset [(Zhu et al., 2010)](https://www.aclweb.org/anthology/C10-1152.pdf), which come from the August 22, 2009 version of Wikipedia. The sentences of TurkCorpus were chosen to be of similar length [(Xu et al., 2016)](https://www.aclweb.org/anthology/Q16-1029.pdf). No further information is provided on the sampling strategy.
The TurkCorpus dataset was developed in order to overcome some of the problems with sentence pairs from Standard and Simple Wikipedia: a large fraction of sentences were misaligned, or not actually simpler [(Xu et al., 2016)](https://www.aclweb.org/anthology/Q16-1029.pdf). However, TurkCorpus mainly focused on *lexical paraphrasing*, and so cannot be used to evaluate simplifications involving *compression* (deletion) or *sentence splitting*. HSplit [(Sulem et al., 2018)](https://www.aclweb.org/anthology/D18-1081.pdf), on the other hand, can only be used to evaluate sentence splitting. The reference sentences in ASSET include a wider variety of sentence rewriting strategies, combining splitting, compression and paraphrasing. Annotators were given examples of each kind of transformation individually, as well as all three transformations used at once, but were allowed to decide which transformations to use for any given sentence.
An example illustrating the differences between TurkCorpus, HSplit and ASSET is given below:
> **Original:** He settled in London, devoting himself chiefly to practical teaching.
>
> **TurkCorpus:** He rooted in London, devoting himself mainly to practical teaching.
>
> **HSplit:** He settled in London. He devoted himself chiefly to practical teaching.
>
> **ASSET:** He lived in London. He was a teacher.
#### Communicative Goal
<!-- info: What was the communicative goal? -->
<!-- scope: periscope -->
The goal is to communicate the same information as the source sentence using simpler words and grammar.
#### Sourced from Different Sources
<!-- info: Is the dataset aggregated from different data sources? -->
<!-- scope: telescope -->
yes
#### Source Details
<!-- info: List the sources (one per line) -->
<!-- scope: periscope -->
Wikipedia
### Language Data
#### How was Language Data Obtained?
<!-- info: How was the language data obtained? -->
<!-- scope: telescope -->
`Found`
#### Where was it found?
<!-- info: If found, where from? -->
<!-- scope: telescope -->
`Single website`
#### Language Producers
<!-- info: What further information do we have on the language producers? -->
<!-- scope: microscope -->
The dataset uses language from Wikipedia: some demographic information is provided [here](https://en.wikipedia.org/wiki/Wikipedia:Who_writes_Wikipedia%3F).
#### Data Validation
<!-- info: Was the text validated by a different worker or a data curator? -->
<!-- scope: telescope -->
not validated
#### Was Data Filtered?
<!-- info: Were text instances selected or filtered? -->
<!-- scope: telescope -->
algorithmically
#### Filter Criteria
<!-- info: What were the selection criteria? -->
<!-- scope: microscope -->
The authors mention that they "extracted 138,095 article pairs from the 2019/09 Wikipedia dump using an improved version of the [WikiExtractor](https://github.com/attardi/wikiextractor) library". The [SpaCy](https://spacy.io/) library is used for sentence splitting.
### Structured Annotations
#### Additional Annotations?
<!-- quick -->
<!-- info: Does the dataset have additional annotations for each instance? -->
<!-- scope: telescope -->
crowd-sourced
#### Number of Raters
<!-- info: What is the number of raters -->
<!-- scope: telescope -->
11<n<50
#### Rater Qualifications
<!-- info: Describe the qualifications required of an annotator. -->
<!-- scope: periscope -->
WikiAuto (Figure Eight): No information provided.
ASSET (MTurk):
- Having a HIT approval rate over 95%, and over 1000 HITs approved. No other demographic or compensation information is provided.
- Passing a Qualification Test (appropriately simplifying sentences). Out of 100 workers, 42 passed the test.
- Being a resident of the United States, United Kingdom or Canada.
TURK (MTurk):
- Reference sentences were written by workers with HIT approval rate over 95%. No other demographic or compensation information is provided.
#### Raters per Training Example
<!-- info: How many annotators saw each training example? -->
<!-- scope: periscope -->
1
#### Raters per Test Example
<!-- info: How many annotators saw each test example? -->
<!-- scope: periscope -->
>5
#### Annotation Service?
<!-- info: Was an annotation service used? -->
<!-- scope: telescope -->
yes
#### Which Annotation Service
<!-- info: Which annotation services were used? -->
<!-- scope: periscope -->
`Amazon Mechanical Turk`, `Appen`
#### Annotation Values
<!-- info: Purpose and values for each annotation -->
<!-- scope: microscope -->
WikiAuto: Sentence alignment labels were crowdsourced for 500 randomly sampled document pairs (10,123 sentence pairs total). The authors pre-selected several alignment candidates from English Wikipedia for each Simple Wikipedia sentence based on various similarity metrics, then asked the crowd-workers to annotate these pairs. Finally, they trained their alignment model on this manually annotated dataset to obtain automatically aligned sentences (138,095 document pairs, 488,332 sentence pairs).
No demographic annotation is provided for the crowd workers. The [Figure Eight](https://www.figure-eight.com/) platform (now part of Appen) was used for the annotation process.
ASSET: The instructions given to the annotators are available [here](https://github.com/facebookresearch/asset/blob/master/crowdsourcing/AMT_AnnotationInstructions.pdf).
TURK: The references are crowdsourced from Amazon Mechanical Turk. The annotators were asked to provide simplifications without losing any information or splitting the input sentence. No other demographic or compensation information is provided in the TURKCorpus paper. The instructions given to the annotators are available in the paper.
#### Any Quality Control?
<!-- info: Quality control measures? -->
<!-- scope: telescope -->
none
### Consent
#### Any Consent Policy?
<!-- info: Was there a consent policy involved when gathering the data? -->
<!-- scope: telescope -->
yes
#### Consent Policy Details
<!-- info: What was the consent policy? -->
<!-- scope: microscope -->
Both Figure Eight and Amazon Mechanical Turk raters forfeit the right to their data as part of their agreements.
### Private Identifying Information (PII)
#### Contains PII?
<!-- quick -->
<!-- info: Does the source language data likely contain Personal Identifying Information about the data creators or subjects? -->
<!-- scope: telescope -->
no PII
#### Justification for no PII
<!-- info: Provide a justification for selecting `no PII` above. -->
<!-- scope: periscope -->
Since the dataset is created from Wikipedia/Simple Wikipedia, all the information contained in the dataset is already in the public domain.
### Maintenance
#### Any Maintenance Plan?
<!-- info: Does the original dataset have a maintenance plan? -->
<!-- scope: telescope -->
no
## Broader Social Context
### Previous Work on the Social Impact of the Dataset
#### Usage of Models based on the Data
<!-- info: Are you aware of cases where models trained on the task featured in this dataset or related tasks have been used in automated systems? -->
<!-- scope: telescope -->
no
### Impact on Under-Served Communities
#### Addresses needs of underserved Communities?
<!-- info: Does this dataset address the needs of communities that are traditionally underserved in language technology, and particularly language generation technology? Communities may be underserved for example because their language, language variety, or social or geographical context is underrepresented in NLP and NLG resources (datasets and models). -->
<!-- scope: telescope -->
no
### Discussion of Biases
#### Any Documented Social Biases?
<!-- info: Are there documented social biases in the dataset? Biases in this context are variations in the ways members of different social categories are represented that can have harmful downstream consequences for members of the more disadvantaged group. -->
<!-- scope: telescope -->
yes
#### Links and Summaries of Analysis Work
<!-- info: Provide links to and summaries of works analyzing these biases. -->
<!-- scope: microscope -->
The dataset may contain some social biases, as the input sentences are based on Wikipedia. Studies have shown that the English Wikipedia contains both gender biases [(Schmahl et al., 2020)](https://research.tudelft.nl/en/publications/is-wikipedia-succeeding-in-reducing-gender-bias-assessing-changes) and racial biases [(Adams et al., 2019)](https://journals.sagepub.com/doi/pdf/10.1177/2378023118823946).
## Considerations for Using the Data
### PII Risks and Liability
#### Potential PII Risk
<!-- info: Considering your answers to the PII part of the Data Curation Section, describe any potential privacy risks to the data subjects and creators when using the dataset. -->
<!-- scope: microscope -->
All the data is in the public domain.
### Licenses
#### Copyright Restrictions on the Dataset
<!-- info: Based on your answers in the Intended Use part of the Data Overview Section, which of the following best describe the copyright and licensing status of the dataset? -->
<!-- scope: periscope -->
`open license - commercial use allowed`
#### Copyright Restrictions on the Language Data
<!-- info: Based on your answers in the Language part of the Data Curation Section, which of the following best describe the copyright and licensing status of the underlying language data? -->
<!-- scope: periscope -->
`open license - commercial use allowed`
### Known Technical Limitations
#### Technical Limitations
<!-- info: Describe any known technical limitations, such as spurious correlations, train/test overlap, annotation biases, or mis-annotations, and cite the works that first identified these limitations when possible. -->
<!-- scope: microscope -->
The dataset may contain some social biases, as the input sentences are based on Wikipedia. Studies have shown that the English Wikipedia contains both gender biases [(Schmahl et al., 2020)](https://research.tudelft.nl/en/publications/is-wikipedia-succeeding-in-reducing-gender-bias-assessing-changes) and racial biases [(Adams et al., 2019)](https://journals.sagepub.com/doi/pdf/10.1177/2378023118823946).
#### Unsuited Applications
<!-- info: When using a model trained on this dataset in a setting where users or the public may interact with its predictions, what are some pitfalls to look out for? In particular, describe some applications of the general task featured in this dataset that its curation or properties make it less suitable for. -->
<!-- scope: microscope -->
Since the test datasets contain only 2,359 sentences derived from Wikipedia, they are limited to a small subset of the topics present on Wikipedia.
|
Atipico1/NQ-20k_preprocessed_with_o-u_case | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: masked_query
dtype: string
- name: original_case
list:
- name: answer
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: question
dtype: string
- name: unans_case
list:
- name: answer
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 187754303
num_examples: 20000
- name: test
num_bytes: 34159853
num_examples: 3610
download_size: 126702831
dataset_size: 221914156
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Maxlinn/LLaVA-Pretrain_Descriptive-Captions | ---
license: cc-by-sa-4.0
---
# LLaVA-Pretrain_Descriptive-Captions
A work of Maxlinn ([林知](https://zhihu.com/people/lin-zhi-nlp)); please give credit if you like this work :)
Inspired by the [DALLE-3 paper](https://cdn.openai.com/papers/dall-e-3.pdf), descriptive captions are very useful for text-to-image models (and possibly vision-language models).
We recaptioned LLaVA's pretraining image-text pairs [blip_laion_cc_sbu_558k.json](https://huggingface.co/datasets/liuhaotian/LLaVA-Pretrain/blob/main/blip_laion_cc_sbu_558k.json) using LLaVA-v1.5-13B. Recaptioning took about 48 hours on 16 high-end GPUs.
## Usage
Can be used as a drop-in replacement for `blip_laion_cc_sbu_558k.json`.
The order of examples, the ids, the image paths, and the human questions are all the same. The only difference is the caption in the gpt turn.
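For illustration, swapping the gpt-turn caption in one record can be sketched as below. The field layout (an `id`, an `image` path, and a two-turn `conversations` list) follows the usual LLaVA pretraining format, and the concrete values are made up; verify both against the actual JSON before relying on them:

```python
import copy

# Hypothetical record in the usual LLaVA pretraining layout; the id, image
# path, and caption text below are illustrative, not taken from the dataset.
record = {
    "id": "004539375",
    "image": "00453/004539375.jpg",
    "conversations": [
        {"from": "human", "value": "Render a clear and concise summary of the photo.\n<image>"},
        {"from": "gpt", "value": "select luxury furniture 3 - inch gel memory foam mattress topper"},
    ],
}

def replace_caption(record, new_caption):
    """Return a copy of the record with the gpt-turn caption swapped out."""
    out = copy.deepcopy(record)
    for turn in out["conversations"]:
        if turn["from"] == "gpt":
            turn["value"] = new_caption
    return out

updated = replace_caption(record, "A thick white mattress topper lies on a wooden bed frame.")
```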
## Example
The original caption can be seen in the prompt.

## Generation Process
To keep the generated descriptive captions faithful but diverse, we use the following user instruction and sampling arguments:
User instruction (we asked GPT-4 to write it):
```
Please provide a detailed and objective description of the image based on the caption "{short_caption}", focusing only on elements that are fully visible. Do not include any inaccurate, emotional or subjective interpretations. Describe the objects, colors, shapes, and arrangement in the image.
```
Sampling arguments: the same as the Gradio demo of LLaVA-v1.5.
- model precision: fp16
- temperature: 0.2
- max_new_tokens: 512
- top_p: 0.7
## Observed Biases
- `llava-v1.5-13b` loves to use the pattern `the image features...` to describe an image.
- `llava-v1.5-13b` may make some errors in counting and in describing text. |
Bench4CO/TSP-Dataset | ---
language:
- en
tags:
- combinatorial-optimization
size_categories:
- 100M<n<1B
---
# TSP Dataset
## Dataset Description
The TSP (Traveling Salesman Problem) dataset is a comprehensive collection of instances specifically designed for studying and solving the TSP, a classic combinatorial optimization problem. The objective of the TSP is to find the shortest possible route for a traveling salesman to visit a set of cities and return to the starting city, while visiting each city exactly once.
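The objective is easy to state in code. The brute-force sketch below, using made-up coordinates, enumerates every tour and keeps the shortest closed route; this is feasible only for tiny instances, since the number of tours grows factorially:

```python
import itertools
import math

# Four illustrative cities at the corners of a 4x3 rectangle.
cities = [(0, 0), (0, 3), (4, 3), (4, 0)]

def tour_length(order):
    """Length of the closed tour that visits cities in the given order."""
    return sum(
        math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
        for i in range(len(order))
    )

# Brute force: check every permutation (O(n!), so only for small n).
best = min(itertools.permutations(range(len(cities))), key=tour_length)
```

For these four cities the shortest tour is the rectangle's perimeter, length 14.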
## Update
- December 6, 2023 |
bot-yaya/un_pdf_6347_v2 | ---
dataset_info:
features:
- name: zh
dtype: string
- name: en
dtype: string
- name: fr
dtype: string
- name: es
dtype: string
- name: ru
dtype: string
- name: record
dtype: string
splits:
- name: train
num_bytes: 1704689239
num_examples: 6347
download_size: 811566117
dataset_size: 1704689239
---
# Dataset Card for "un_pdf_random9208_preprocessed_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Reccamike23/CLAVIS_FURNITURE | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 85772817.0
num_examples: 72
download_size: 84810707
dataset_size: 85772817.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sargishunanyan/thermostats | ---
task_categories:
- image-segmentation
tags:
- roboflow
- roboflow2huggingface
---
<div align="center">
<img width="640" alt="sargishunanyan/thermostats" src="https://huggingface.co/datasets/sargishunanyan/thermostats/resolve/main/thumbnail.jpg">
</div>
### Dataset Labels
```
['housing', 'thermostat']
```
### Number of Images
```json
{'valid': 35, 'test': 18, 'train': 123}
```
### How to Use
- Install [datasets](https://pypi.org/project/datasets/):
```bash
pip install datasets
```
- Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("sargishunanyan/thermostats", name="full")
example = ds['train'][0]
```
### Roboflow Dataset Page
[https://universe.roboflow.com/yolo-po0ro/thermo-part-3/dataset/1](https://universe.roboflow.com/yolo-po0ro/thermo-part-3/dataset/1?ref=roboflow2huggingface)
### Citation
```
@misc{ thermo-part-3_dataset,
title = { Thermo, part 3 Dataset },
type = { Open Source Dataset },
author = { Yolo },
howpublished = { \\url{ https://universe.roboflow.com/yolo-po0ro/thermo-part-3 } },
url = { https://universe.roboflow.com/yolo-po0ro/thermo-part-3 },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2023 },
month = { oct },
note = { visited on 2023-10-18 },
}
```
### License
CC BY 4.0
### Dataset Summary
This dataset was exported via roboflow.com on October 16, 2023 at 4:27 AM GMT
Roboflow is an end-to-end computer vision platform that helps you
* collaborate with your team on computer vision projects
* collect & organize images
* understand and search unstructured image data
* annotate, and create datasets
* export, train, and deploy computer vision models
* use active learning to improve your dataset over time
For state of the art Computer Vision training notebooks you can use with this dataset,
visit https://github.com/roboflow/notebooks
To find over 100k other datasets and pre-trained models, visit https://universe.roboflow.com
The dataset includes 176 images.
Thermostats are annotated in COCO format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
* Resize to 640x640 (Stretch)
No image augmentation techniques were applied.
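Reading COCO-format annotations needs only plain dictionaries. The sketch below uses a made-up in-memory example with this dataset's two labels; the real export ships a JSON file whose exact name and fields should be checked first:

```python
# Minimal, illustrative COCO-style structure matching this dataset's two labels.
coco = {
    "images": [{"id": 1, "file_name": "frame_001.jpg", "width": 640, "height": 640}],
    "categories": [{"id": 0, "name": "housing"}, {"id": 1, "name": "thermostat"}],
    "annotations": [
        # In COCO convention, bbox is [x, y, width, height] in pixels.
        {"id": 10, "image_id": 1, "category_id": 1, "bbox": [120.0, 80.0, 200.0, 150.0]},
    ],
}

id_to_name = {c["id"]: c["name"] for c in coco["categories"]}
labels = [id_to_name[a["category_id"]] for a in coco["annotations"]]
```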
|
sxu/CANLI | ---
license: afl-3.0
annotations_creators:
- expert-generated
language:
- zh
language_creators:
- expert-generated
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
---
# Dataset Card for CANLI
### Dataset Summary
[CANLI: The Chinese Causative-Passive Homonymy Disambiguation: an Adversarial Dataset for NLI and a Probing Task](http://www.lrec-conf.org/proceedings/lrec2022/pdf/2022.lrec-1.460.pdf)
The disambiguation of causative-passive homonymy (CPH) is potentially tricky for machines, as the causative and the passive
are not distinguished by the sentence's syntactic structure. By transforming CPH disambiguation into a challenging natural
language inference (NLI) task, we present the first Chinese Adversarial NLI challenge set (CANLI). We show that the pretrained
transformer model RoBERTa, fine-tuned on an existing large-scale Chinese NLI benchmark dataset, performs poorly on CANLI.
We also employ Word Sense Disambiguation as a probing task to investigate to what extent the CPH feature is captured in
the model's internal representation. We find that the model's performance on CANLI does not correspond to its internal
representation of CPH, which is the crucial linguistic ability central to the CANLI dataset.
### Languages
Chinese Mandarin
### Citation Information
```
@inproceedings{xu-markert-2022-chinese,
    title = "The {C}hinese Causative-Passive Homonymy Disambiguation: an adversarial Dataset for {NLI} and a Probing Task",
    author = "Xu, Shanshan and Markert, Katja",
    booktitle = "Proceedings of the Thirteenth Language Resources and Evaluation Conference",
    month = jun,
    year = "2022",
    address = "Marseille, France",
    publisher = "European Language Resources Association",
    url = "https://aclanthology.org/2022.lrec-1.460",
    pages = "4316--4323",
}
```
|
lmms-lab/VQAv2 | ---
license: cc-by-4.0
dataset_info:
features:
- name: question_type
dtype: string
- name: multiple_choice_answer
dtype: string
- name: answers
list:
- name: answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: image_id
dtype: int64
- name: answer_type
dtype: string
- name: question_id
dtype: int64
- name: question
dtype: string
- name: image
dtype: image
splits:
- name: validation
num_bytes: 33693404566.41
num_examples: 214354
- name: testdev
num_bytes: 17592305340.906
num_examples: 107394
- name: test
num_bytes: 71407026207.344
num_examples: 447793
download_size: 44780405115
dataset_size: 190384873283.36398
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: testdev
path: data/testdev-*
- split: test
path: data/test-*
---
|
CyberHarem/hidaka_ai_theidolmster | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hidaka_ai (THE iDOLM@STER)
This is the dataset of hidaka_ai (THE iDOLM@STER), containing 248 images and their tags.
The core tags of this character are `brown_hair, short_hair, brown_eyes, antenna_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 248 | 154.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hidaka_ai_theidolmster/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 248 | 120.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hidaka_ai_theidolmster/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 399 | 200.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hidaka_ai_theidolmster/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 248 | 146.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hidaka_ai_theidolmster/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 399 | 237.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hidaka_ai_theidolmster/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hidaka_ai_theidolmster',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 42 |  |  |  |  |  | 1girl, smile, open_mouth, cute_&_girly_(idolmaster), solo, blush, gloves |
| 1 | 7 |  |  |  |  |  | 1girl, hoodie, solo, open_mouth, smile |
| 2 | 7 |  |  |  |  |  | 1girl, cleavage, solo, medium_breasts, navel, smile, pink_bikini, side-tie_bikini_bottom |
| 3 | 7 |  |  |  |  |  | 1girl, hetero, nipples, penis, solo_focus, 1boy, blush, medium_breasts, open_mouth, sex, vaginal, cum_in_pussy, bar_censor, girl_on_top, mosaic_censoring, nude, straddling, tears |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | open_mouth | cute_&_girly_(idolmaster) | solo | blush | gloves | hoodie | cleavage | medium_breasts | navel | pink_bikini | side-tie_bikini_bottom | hetero | nipples | penis | solo_focus | 1boy | sex | vaginal | cum_in_pussy | bar_censor | girl_on_top | mosaic_censoring | nude | straddling | tears |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------------|:----------------------------|:-------|:--------|:---------|:---------|:-----------|:-----------------|:--------|:--------------|:-------------------------|:---------|:----------|:--------|:-------------|:-------|:------|:----------|:---------------|:-------------|:--------------|:-------------------|:-------|:-------------|:--------|
| 0 | 42 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | X | | | X | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | | | X | | | | X | X | X | X | X | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | X | | | X | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
damerajee/en-kannada | ---
license: apache-2.0
---
|
scribis/Wikipedia-it-Trame-di-Romanzi | ---
license: cc-by-nc-2.0
language:
- it
tags:
- wikipedia
---
A collection of novel plot summaries from the Italian Wikipedia (April 2024) |
senhorsapo/raphael | ---
license: openrail
---
|
tilyupo/qa2a | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: question
dtype: string
- name: correct_answer
dtype: string
- name: wrong_answers
dtype: string
splits:
- name: train
num_bytes: 78996049.42025253
num_examples: 391907
- name: validation
num_bytes: 8319005.87504651
num_examples: 41325
download_size: 52850487
dataset_size: 87315055.29529904
---
# Dataset Card for "qa2a"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Shawt/liz | ---
license: openrail
tags:
- art
- lizz
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AI4FinTech/ellipticpp | ---
license: unknown
---
|
ameemazainab/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 0
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovior/twitter_dataset_1713086545 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2438640
num_examples: 7119
download_size: 1409449
dataset_size: 2438640
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yuan-sf63/word_label_0.2_72_P | ---
dataset_info:
features:
- name: text
dtype: string
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
- name: '16'
dtype: int64
- name: '17'
dtype: int64
- name: '18'
dtype: int64
- name: '19'
dtype: int64
- name: '20'
dtype: int64
- name: '21'
dtype: int64
- name: '22'
dtype: int64
- name: '23'
dtype: int64
- name: '24'
dtype: int64
- name: '25'
dtype: int64
- name: '26'
dtype: int64
- name: '27'
dtype: int64
- name: '28'
dtype: int64
- name: '29'
dtype: int64
- name: '30'
dtype: int64
- name: '31'
dtype: int64
- name: '32'
dtype: int64
- name: '33'
dtype: int64
- name: '34'
dtype: int64
- name: '35'
dtype: int64
- name: '36'
dtype: int64
- name: '37'
dtype: int64
- name: '38'
dtype: int64
- name: '39'
dtype: int64
- name: '40'
dtype: int64
- name: '41'
dtype: int64
- name: '42'
dtype: int64
- name: '43'
dtype: int64
- name: '44'
dtype: int64
- name: '45'
dtype: int64
- name: '46'
dtype: int64
- name: '47'
dtype: int64
- name: '48'
dtype: int64
- name: '49'
dtype: int64
- name: '50'
dtype: int64
- name: '51'
dtype: int64
- name: '52'
dtype: int64
- name: '53'
dtype: int64
- name: '54'
dtype: int64
- name: '55'
dtype: int64
- name: '56'
dtype: int64
- name: '57'
dtype: int64
- name: '58'
dtype: int64
- name: '59'
dtype: int64
- name: '60'
dtype: int64
- name: '61'
dtype: int64
- name: '62'
dtype: int64
- name: '63'
dtype: int64
- name: '64'
dtype: int64
- name: '65'
dtype: int64
- name: '66'
dtype: int64
- name: '67'
dtype: int64
- name: '68'
dtype: int64
- name: '69'
dtype: int64
- name: '70'
dtype: int64
- name: '71'
dtype: int64
splits:
- name: train
num_bytes: 50006440.62244486
num_examples: 71326
- name: validation
num_bytes: 5556894.37755514
num_examples: 7926
download_size: 9720462
dataset_size: 55563335.0
---
# Dataset Card for "word_label_0.2_72_P"
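The schema above pairs each `text` with 72 integer columns named `'0'` through `'71'`. Assuming these are 0/1 per-class indicators (the card does not confirm this), a minimal sketch of collecting the active label indices for one example row:

```python
def active_labels(row, num_classes=72):
    """Return the indices of the label columns set to 1 in one example.

    `row` is a dict mapping the stringified class index ("0".."71") to an
    integer indicator, matching the schema above. Assumes 0/1 indicators.
    """
    return [i for i in range(num_classes) if row[str(i)] == 1]
```

If the columns instead hold counts or scores, the equality check would need to become a threshold comparison.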
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigscience-data/roots_indic-te_wikisource | ---
language: te
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-te_wikisource
# wikisource_filtered
- Dataset uid: `wikisource_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 2.6306 % of total
- 12.7884 % of fr
- 19.8886 % of indic-bn
- 20.9966 % of indic-ta
- 2.3478 % of ar
- 4.7068 % of indic-hi
- 18.0998 % of indic-te
- 1.7155 % of es
- 19.4800 % of indic-kn
- 9.1737 % of indic-ml
- 17.1771 % of indic-mr
- 17.1870 % of indic-gu
- 70.3687 % of indic-as
- 1.0165 % of pt
- 7.8642 % of indic-pa
- 1.3501 % of vi
- 4.9411 % of indic-or
- 0.5307 % of ca
- 2.3593 % of id
- 1.5928 % of eu
### BigScience processing steps
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: indic-bn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: ar
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: indic-kn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- remove_wiki_mojibake
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-as
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-pa
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: vi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-or
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
#### Filters applied to: ca
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: id
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
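Most language subsets above end with a `filter_small_docs_bytes_300` (or `_1024`) step. As a rough sketch of what such a filter does (not the actual BigScience implementation), it keeps only documents whose UTF-8 encoding reaches a minimum byte size:

```python
def filter_small_docs_bytes(docs, min_bytes=300):
    """Drop documents smaller than `min_bytes` when encoded as UTF-8."""
    return [doc for doc in docs if len(doc.encode("utf-8")) >= min_bytes]
```

Measuring bytes rather than characters keeps the threshold roughly comparable across scripts, since characters in scripts such as Telugu occupy several bytes each in UTF-8.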
|
itssid/EHS_CUSTOM | ---
license: unknown
---
|
autoevaluate/autoeval-eval-futin__feed-sen_vi_-0f1239-2245871651 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-1b1
metrics: []
dataset_name: futin/feed
dataset_config: sen_vi_
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-1b1
* Dataset: futin/feed
* Config: sen_vi_
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
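The column mapping above pairs each `text` with a list of candidate `classes` and a `target` index. Given predictions in that shape, an accuracy score can be computed with a sketch like the following (illustrative only; this is not the AutoTrain evaluator itself):

```python
def zero_shot_accuracy(examples, predict):
    """Fraction of examples where `predict(text, classes)` returns the target index.

    `examples` is an iterable of dicts with "text", "classes" (candidate
    labels), and "target" (index of the correct label); `predict` is any
    callable mapping (text, classes) to a predicted index.
    """
    examples = list(examples)
    correct = sum(predict(ex["text"], ex["classes"]) == ex["target"] for ex in examples)
    return correct / len(examples)
```

Any zero-shot model can be plugged in as `predict`, e.g. one that scores each candidate label against the text and returns the argmax.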
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
hc95qc/embeddings | ---
license: cc
---
|
liuyanchen1015/MULTI_VALUE_sst2_never_negator | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 299
num_examples: 2
- name: test
num_bytes: 1136
num_examples: 8
- name: train
num_bytes: 16506
num_examples: 144
download_size: 12868
dataset_size: 17941
---
# Dataset Card for "MULTI_VALUE_sst2_never_negator"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-platypus-13b-0.10e | ---
pretty_name: Evaluation run of uukuguy/speechless-codellama-orca-platypus-13b-0.10e
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-codellama-orca-platypus-13b-0.10e](https://huggingface.co/uukuguy/speechless-codellama-orca-platypus-13b-0.10e)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-platypus-13b-0.10e\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T14:54:29.987056](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-platypus-13b-0.10e/blob/main/results_2023-10-24T14-54-29.987056.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 8.913590604026845e-05,\n \"f1_stderr\"\
: 2.996167513080367e-05,\n \"acc\": 0.24861878453038674,\n \"acc_stderr\"\
: 0.007026135605808221\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n\
\ \"em_stderr\": 0.0,\n \"f1\": 8.913590604026845e-05,\n \"\
f1_stderr\": 2.996167513080367e-05\n },\n \"harness|gsm8k|5\": {\n \
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.4972375690607735,\n \"acc_stderr\": 0.014052271211616441\n\
\ }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-codellama-orca-platypus-13b-0.10e
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|arc:challenge|25_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|arc:challenge|25_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T17_09_41.931905
path:
- '**/details_harness|drop|3_2023-10-17T17-09-41.931905.parquet'
- split: 2023_10_24T14_54_29.987056
path:
- '**/details_harness|drop|3_2023-10-24T14-54-29.987056.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T14-54-29.987056.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T17_09_41.931905
path:
- '**/details_harness|gsm8k|5_2023-10-17T17-09-41.931905.parquet'
- split: 2023_10_24T14_54_29.987056
path:
- '**/details_harness|gsm8k|5_2023-10-24T14-54-29.987056.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T14-54-29.987056.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hellaswag|10_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hellaswag|10_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T08:11:13.966337.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T14-48-20.175227.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-04T08:11:13.966337.parquet'
- split: 2023_09_12T14_48_20.175227
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T14-48-20.175227.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T14-48-20.175227.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T17_09_41.931905
path:
- '**/details_harness|winogrande|5_2023-10-17T17-09-41.931905.parquet'
- split: 2023_10_24T14_54_29.987056
path:
- '**/details_harness|winogrande|5_2023-10-24T14-54-29.987056.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T14-54-29.987056.parquet'
- config_name: results
data_files:
- split: 2023_09_04T08_11_13.966337
path:
- results_2023-09-04T08:11:13.966337.parquet
- split: 2023_09_12T14_48_20.175227
path:
- results_2023-09-12T14-48-20.175227.parquet
- split: 2023_10_17T17_09_41.931905
path:
- results_2023-10-17T17-09-41.931905.parquet
- split: 2023_10_24T14_54_29.987056
path:
- results_2023-10-24T14-54-29.987056.parquet
- split: latest
path:
- results_2023-10-24T14-54-29.987056.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-codellama-orca-platypus-13b-0.10e
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-codellama-orca-platypus-13b-0.10e
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-codellama-orca-platypus-13b-0.10e](https://huggingface.co/uukuguy/speechless-codellama-orca-platypus-13b-0.10e) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-platypus-13b-0.10e",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T14:54:29.987056](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-platypus-13b-0.10e/blob/main/results_2023-10-24T14-54-29.987056.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split of each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 8.913590604026845e-05,
"f1_stderr": 2.996167513080367e-05,
"acc": 0.24861878453038674,
"acc_stderr": 0.007026135605808221
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 8.913590604026845e-05,
"f1_stderr": 2.996167513080367e-05
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.4972375690607735,
"acc_stderr": 0.014052271211616441
}
}
```
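As a quick illustration (not part of the auto-generated card), the nested per-task dict above can be flattened into `(task, metric, value)` rows for easier inspection. This is a minimal sketch with the values hard-coded from the "Latest results" JSON shown above; in practice you would read them from the downloaded results file instead:

```python
# Per-task metrics copied from the "Latest results" JSON above
# (the "all" aggregate entry is omitted for brevity).
results = {
    "harness|drop|3": {"em": 0.0, "f1": 8.913590604026845e-05},
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.4972375690607735},
}

# Flatten the nested dict into (task, metric, value) tuples.
rows = [
    (task, metric, value)
    for task, metrics in results.items()
    for metric, value in metrics.items()
]

for task, metric, value in rows:
    print(f"{task:25s} {metric:4s} {value:.4f}")
```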
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CodeTranslatorLLM/Code-Translation | ---
license: mit
---
|
CVasNLPExperiments/fairness_firefighter_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_4800 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: scores
sequence: float64
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 2480232
num_examples: 4800
download_size: 183869
dataset_size: 2480232
---
# Dataset Card for "fairness_firefighter_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_4800"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nihaomur/breeze7B_tokenized_med | ---
dataset_info:
features:
- name: output
dtype: string
- name: input
dtype: string
- name: instruction
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 1089133996.6720784
num_examples: 639794
- name: validation
num_bytes: 272284350.32792157
num_examples: 159949
download_size: 625233880
dataset_size: 1361418347.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
communityai/Telugu-LLM-Labs___assamese_alpaca_yahma_cleaned_filtered | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 80191536.0
num_examples: 28910
download_size: 27904025
dataset_size: 80191536.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hugenluc/testembedding | ---
license: mit
---
|