| datasetId | card |
|---|---|
Isotonic/DialogSumm | ---
language:
- en
license: cc-by-nc-sa-4.0
size_categories:
- 10K<n<100K
task_categories:
- summarization
- text-generation
- text2text-generation
dataset_info:
features:
- name: dialogue
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 48177311.0
num_examples: 52480
download_size: 29232356
dataset_size: 48177311.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
## DialogSumm
DialogSumm is a mixture of the following dialog datasets:
- [dialogsum](https://huggingface.co/datasets/knkarthick/dialogsum)
- [samsum](https://huggingface.co/datasets/samsum)
- [MocktaiLEngineer/qmsum-processed](https://huggingface.co/datasets/MocktaiLEngineer/qmsum-processed)
- [npc-engine/light-batch-summarize-dialogue](https://huggingface.co/datasets/npc-engine/light-batch-summarize-dialogue)
## 💻 Usage
```python
from datasets import load_dataset
dataset = load_dataset("Isotonic/DialogSumm")
```
🚀🚀 Next: DialogSumm + [cnn_dailymail](https://huggingface.co/datasets/cnn_dailymail) + [mediasum](https://huggingface.co/datasets/ccdv/mediasum) + [EdinburghNLP/xsum](https://huggingface.co/datasets/EdinburghNLP/xsum) |
alvarobartt/zephyr-7b-beta-judgelm | ---
dataset_info:
features:
- name: input
dtype: string
- name: generation_prompt
dtype: string
- name: raw_generation_responses
sequence: string
- name: generations
sequence: string
- name: labelling_prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_labelling_response
struct:
- name: choices
list:
- name: finish_reason
dtype: string
- name: index
dtype: int64
- name: message
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: created
dtype: int64
- name: id
dtype: string
- name: model
dtype: string
- name: object
dtype: string
- name: usage
struct:
- name: completion_tokens
dtype: int64
- name: prompt_tokens
dtype: int64
- name: total_tokens
dtype: int64
- name: ratings
sequence: int64
- name: rationale
dtype: string
splits:
- name: train
num_bytes: 1409108
num_examples: 100
download_size: 455420
dataset_size: 1409108
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "zephyr-7b-beta-judgelm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joaogabrielcasanova/kauan3.0 | ---
license: openrail
---
|
Multimodal-Fatima/DTD_parition1_test_facebook_opt_1.3b_Attributes_Caption_ns_1880 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_16
num_bytes: 91757391.0
num_examples: 1880
- name: fewshot_1_bs_16
num_bytes: 92256063.0
num_examples: 1880
- name: fewshot_3_bs_16
num_bytes: 93264935.0
num_examples: 1880
- name: fewshot_5_bs_16
num_bytes: 94273956.0
num_examples: 1880
- name: fewshot_8_bs_16
num_bytes: 95791838.0
num_examples: 1880
download_size: 455214193
dataset_size: 467344183.0
---
# Dataset Card for "DTD_parition1_test_facebook_opt_1.3b_Attributes_Caption_ns_1880"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lucainiao/MAESTRO_2004_SYNTH | ---
license: openrail
---
# MAESTRO-2004-SYNTH Dataset
This is a synthesized audio dataset created from the MIDI files of the [MAESTRO dataset](https://magenta.tensorflow.org/datasets/maestro).
Audio files are batch-synthesized in the [REAPER DAW](https://www.reaper.fm/) using a superposition of triangle, square, and sine waves. |
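For illustration only, that superposition of waveforms can be sketched in NumPy; the equal-weight mix and the 44.1 kHz sample rate are assumptions, since the card does not specify the REAPER render settings:

```python
import numpy as np

def synth_tone(freq_hz, duration_s, sr=44100):
    """Mix sine, square, and triangle waves at one frequency."""
    t = np.arange(int(sr * duration_s)) / sr
    phase = 2 * np.pi * freq_hz * t
    sine = np.sin(phase)
    square = np.sign(sine)                    # square wave from the sine's sign
    triangle = (2 / np.pi) * np.arcsin(sine)  # triangle wave in [-1, 1]
    return (sine + square + triangle) / 3.0   # equal-weight mix, peak <= 1

tone = synth_tone(440.0, 0.1)
print(len(tone))  # 4410 samples
```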
katsuzin/SatoruGojoBR | ---
license: openrail
---
|
gadkins/who-covid-19-epidemiological-update-edition-163 | ---
task_categories:
- text-generation
language:
- en
tags:
- fine-tuning
- touch rugby
size_categories:
- n<1K
---
# World Health Organization (WHO) Epidemiological Update - Edition 163 (for embeddings)
train.pnf is taken from the [WHO website](https://www.who.int/publications/m/item/covid-19-epidemiological-update---19-january-2024).
test.csv was generated by GPT-3.5-turbo.
All text is chunked to a length of 500 tokens with 10% overlap. |
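The chunking scheme above (fixed 500-token windows with 10% overlap) can be sketched as follows; `chunk_tokens` is a hypothetical helper and the tokenizer itself is abstracted away:

```python
def chunk_tokens(tokens, size=500, overlap=0.10):
    """Split a token list into fixed-size windows with fractional overlap."""
    step = int(size * (1 - overlap))  # 450-token stride -> 50 shared tokens
    chunks = []
    for start in range(0, len(tokens), step):
        chunks.append(tokens[start:start + size])
        if start + size >= len(tokens):
            break
    return chunks

demo = chunk_tokens(list(range(1000)))
print(len(demo))                         # 3 windows
print(len(set(demo[0]) & set(demo[1])))  # 50 overlapping tokens (10% of 500)
```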
fabraz/writingPromptAug | ---
license: mit
task_categories:
- text-generation
language:
- en
tags:
- writingPrompt
dataset_info:
- config_name: 0.0.1
features:
- name: splitLineIndex
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_examples: 13232
- name: valid
num_examples: 5168
- name: test
num_examples: 5700
---
# Dataset Card for Writing Prompt Augmentation Dataset
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:** Writing Prompt Augmentation Dataset was built to expand samples of [FAIR Writing Prompt Dataset](https://www.kaggle.com/datasets/ratthachat/writing-prompts)
- **Leaderboard:**
- **Point of Contact:** [Fabricio Braz](mailto:fabraz@unb.br)
### Dataset Summary
Writing Prompt Augmentation Dataset was built to expand samples of [FAIR Writing Prompt Dataset](https://www.kaggle.com/datasets/ratthachat/writing-prompts), for feeding [Open Assistant](https://github.com/LAION-AI/Open-Assistant).
### Languages
English
## Dataset Structure
### Data Instances
```javascript
{"splitLineIndex":36888,
"text":"User: write me a story about: Most responses on here have a twist , and all of them are fictional . Show us a piece of your actual life ; let the reader experience you ., where the middle of the story is about: The boss sits on my wall and asks me where a urgent report is up. I stare at the screen, confused. -> Rosey: Sure, here\'s a story about: Most responses on here have a twist , and all of them are fictional . Show us a piece of your actual life ; let the reader experience you ., where the middle of the story is about: The boss sits on my wall and asks me where a urgent report is up. I stare at the screen, confused.:\\nMy bedroom was dark , but the chatter of distant birds outside of the heavily curtained window suggested a new day was dawning . I stayed in bed , one with the shadows , and stared at the indistinct blackness that filled the ceiling . A trickle of sickly grey light would soon spring from the edges of curtain - the heavy fabric could only do so much to dam the brightness of day . \\n \\n But , for a few more minutes , it was still dark . \\n \\n And I was staring at it - staring at the darkness . Examining it . \\n \\n What was I looking for ? Why , answers of course . Why does anyone stare into the darkness of a bedroom ceiling ? I was seeking answers . \\n \\n Why do it ? I asked myself . Why go to work ? You \'re good at your job when you can be bothered to do it , but how often does that happen ? How often do you really put the effort in ? \\n \\n Can you even remember enjoying it ? \\n \\n Can you remember when you were happy ? \\n \\n I had been too deep in my hunt for answers to notice that the homogenous darkness had given way to a bluish grey world of shapes and objects . My feet swung out of bed and I sat up in the early morning coldness . \\n \\n When *was* I happy last ? \\n \\n I stood up and started my day . 
\\n \\n * * * \\n \\n The kitchen was filling with light , the muted greys and blues of morning had arrived first , but each minute that passed promised the arrival of the full colours of day . \\n \\n The spoon clinked in the bowl as I scooped up some cereal . I wore only what I had to bed : boxer shirts and a t-shirt . The winter cold does n\'t bother you when you \'ve stopped caring . \\n \\n *When* was I happy ? \\n \\n The question was echoing in my head . A great puzzle . A mystery of the ages . \\n \\n I gulped the last of my morning coffee and went to the bathroom . \\n \\n * * * \\n \\n The plug hole held no answers , no matter how long I stared . \\n \\n How long had I been staring ? \\n \\n I turned the shower off and stepped out into the sterile tiled whiteness . A lifetime of habits drew me to the basin and , without thought , I started to brush my teeth . My mind was still locked , frozen , on the question . \\n \\n When was I happy ? \\n \\n As I wondered , day continued it \'s steady march outside . \\n \\n The bathroom was clean and white , morning light filtered in through a frosted window . The birds were loud now , but I could hardly hear them over the whir of the steam sucking fan above me . \\n \\n Day had officially arrived . \\n \\n Perhaps I am asking myself the wrong question , I thought . \\n \\n The man in the mirror bared his teeth and scrubbed some more , white foam dripped in blobs about the basin . \\n \\n *What* makes me happy ? \\n \\n * * * \\n \\n I had slipped into my work clothes : business shirt , dress pants , leather shoes . My prisoners garb . As I pulled the items on they weighed me down , each a colossal burden . At least I did n\'t wear a tie any more . 
\\n \\n I had given up on ties , and the rest of my uniform wore the scars of neglect : the shirt was unironed , the pants were thin at the knees and the stitching had come loose at the bottoms , the shoes were beaten , scratched , the soles and tops barely held their bond . \\n \\n This is the business attire of a man who has stopped caring . \\n \\n No one at work seemed to mind . \\n \\n I walked to the front door of my house , shuffling without enthusiasm , without joy for the new day that lay on the other side . \\n \\n I grabbed the handle . \\n \\n What makes me happy ? \\n \\n * * * \\n \\n Another request , another complaint , and my list of work grew longer . It only ever grew longer these days . I had important calls to make , issues to resolve , reports to write - but all I did , for the most part , was stare . \\n \\n Stare at my screen . At my hands . At nothing . \\n \\n The questions I had been asking in the darkness and through-out my house during my morning preparations were not new . I had been thinking on them for a while . I did not know for how long . \\n \\n Weeks ? No . Months . \\n \\n Still no answers . \\n \\n What I do know is : I am *not* happy . \\n \\n The boss leaned on my cubicle wall and asked me where an urgent report , a report that had been urgent for weeks , was up to . The bullshit I served sated his questions and as he walked away I sighed and stared at my screen . \\n \\n To my surprise the report was there . I had been working on it absent-mindedly . Try as I might I still did my job , at least to a degree . \\n \\n Manager for a division of one . Writer of reports and promiser of game changing applications . Mr IT . \\n \\n Well ... at one time I had been Mr IT . Once , when I had been passionate , had had a fire in my belly that churned the engine of my rising star . A career in IT . I had wanted this . \\n \\n Had n\'t I ? \\n \\n Then , why are n\'t I happy ? \\n \\n Because , you did n\'t want this . You never did . 
You stepped out of high school and fell into it . You \'re good with computers - at least , you were - but they never made you happy . You liked the challenge , sure , but you did it because you had to pay the bills and you had to leave your parents house at some point . \\n \\n Then it was a matter of you being lazy and gutless . Work is a hard habit to break , especially when people keep throwing money at you . You \'d just go in , day after day . Week after week . Month after ... \\n \\n School was almost a decade away and you have n\'t done half of what you wanted . Remember writing ? You were going to write , remember ? You \'ve done some shorts over the years , but you wanted more . You wanted to type those two words . After months and months , you \'d type those two words and you \'d have accomplished sonething . The End . And your book would be done - who cares if it got published . Who cares if no one but you ever saw it . \\n \\n You \'d have written something . You \'d have accomplished something . \\n \\n You \'d be ... \\n \\n And there it is . The answer . \\n \\n Ten years of wasted time - ten years of excuses and meeting other people \'s expectations . Ten years of syaing you \'ll get around to it . \\n \\n Ten years of regret . \\n \\n The report was done . So was I . \\n \\n How do I do this ? Do I walk in and hand in the report and a resignation . No . I ca n\'t do that . These people have been good to me . I need to finish up some of the jobs . Need to get them ready for my abscence . \\n \\n Or am I making excuses ? \\n \\n My screen and my work came into focus . I knew what I needed to do , could feel , almost by instinct , what job \'s were my biggest priorities . A spark lit in my gut and passion trickled through my veins . \\n \\n I was n\'t turning back into Mr IT - could in fact , never be that man again . \\n \\n But I knew what made me happy . Knew how to get there ... \\n \\n ... and could feel it there , just on my horizon ."}
```
### Data Fields
* splitLineIndex: the line index of the example in the data source.
* text: the actual prompt/story text.
### Data Splits
|split|samples|
|--|--|
|train|13232|
|valid|5168|
|test|5700|
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
As mentioned, this dataset extends the FAIR writing prompt dataset. The steps used to create it are documented in the Jupyter notebook included in the repository files.
#### Who are the source language producers?
FAIR
### Personal and Sensitive Information
The data comes with NSFW samples. Be aware!
## Additional Information
### Licensing Information
Writing Prompt Augmentation Dataset is licensed under MIT.
### Citation Information
Used to generate consistent stories, following Hierarchical Neural Story Generation (Fan et al., 2018): https://arxiv.org/abs/1805.04833
### Contributions
Thanks to Huu Nguyen (gh:ontocord)! |
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-36c277-93197145790 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: ainize/bart-base-cnn
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: ainize/bart-base-cnn
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@sasha](https://huggingface.co/sasha) for evaluating this model. |
tiagoblima/qg_squad_v1_pt | ---
dataset_info:
features:
- name: question
dtype: string
- name: paragraph_id
dtype: string
- name: paragraph
dtype: string
- name: answer
dtype: string
- name: paragraph_question
dtype: string
- name: paragraph_answer
dtype: string
- name: sentence
dtype: string
- name: answer_sentence
dtype: string
- name: paragraph_sentence
dtype: string
splits:
- name: train
num_bytes: 200934269
num_examples: 51704
- name: validation
num_bytes: 12651991.866603445
num_examples: 3164
- name: test
num_bytes: 12647993.133396555
num_examples: 3163
download_size: 74014585
dataset_size: 226234254.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
lordsymbol/zeumusic | ---
license: openrail
---
|
Salexoid/ru-messages-summarization | ---
license: mit
dataset_info:
features:
- name: message
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 347116
num_examples: 857
download_size: 88234
dataset_size: 347116
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
toanduc/t5-sumary-dataset | ---
language:
- vi
--- |
azugarini/clue-instruct | ---
dataset_info:
features:
- name: keyword
dtype: string
- name: context
dtype: string
- name: category
dtype: string
- name: url
dtype: string
- name: clues
dtype: string
splits:
- name: train
num_bytes: 66439591
num_examples: 43475
- name: test
num_bytes: 917280
num_examples: 600
download_size: 41952418
dataset_size: 67356871
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
task_categories:
- text-generation
language:
- en
size_categories:
- 10K<n<100K
---
# Dataset Card for Clue-Instruct
Clue-Instruct is a synthetic dataset for educational crossword puzzles.
Each context-keyword pair is associated with three distinct, automatically generated crossword clues.
## Pre-print
You can find the pre-print [here](https://arxiv.org/abs/2404.06186).
## License
Attribution-NonCommercial 4.0 International |
zahraa/AgriNet | ---
license: cc-by-nc-nd-3.0
---
|
victoriadreis/TuPY_dataset_binary | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- pt
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids: []
pretty_name: TuPy
language_bcp47:
- pt-BR
tags:
- hate-speech-detection
configs:
- config_name: binary
data_files:
- split: full
path: tupy_binary_vote.csv
---
# Portuguese Hate Speech Dataset (TuPy)
The Portuguese hate speech dataset (TuPy) is an annotated corpus designed to facilitate the development of advanced hate speech detection models using machine learning (ML) and natural language processing (NLP) techniques. TuPy consists of 10,000 unpublished annotated tweets collected in 2023.
This repository is organized as follows:
```sh
root.
├── annotations : classification given by annotators
├── raw corpus : dataset before being split between annotators
├── tupy datasets : combined result of annotations
└── README.md
```
## Voting process
To generate the binary matrices, we employed a straightforward voting process. Three distinct evaluations were assigned to each document. In cases where a document received two or more identical classifications, the adopted value is set to 1; otherwise, it is marked as 0.
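Under one reading of that rule (a document is labelled 1 when at least two of its three annotations are 1), the vote can be sketched as:

```python
def majority_vote(annotations):
    """Map three 0/1 annotations per document to a single binary label.

    A document is labelled 1 when two or more annotators marked it 1,
    and 0 otherwise.
    """
    return [1 if sum(votes) >= 2 else 0 for votes in annotations]

print(majority_vote([[1, 1, 0], [0, 0, 1], [1, 1, 1]]))  # [1, 0, 1]
```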
## Acknowledgements
The TuPy project grew out of Felipe Oliveira's thesis and the work of several collaborators. This project is financed by the Federal University of Rio de Janeiro ([UFRJ](https://ufrj.br/)) and the Alberto Luiz Coimbra Institute for Postgraduate Studies and Research in Engineering ([COPPE](https://coppe.ufrj.br/)). |
GalaktischeGurke/invoices_instruct_vf_weird | ---
dataset_info:
features:
- name: ground_truth
dtype: string
- name: response
dtype: string
- name: instruction
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2720389
num_examples: 501
download_size: 1109588
dataset_size: 2720389
---
# Dataset Card for "invoices_instruct_vf_weird"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Geonmo/deepfashion-multimodal-descriptions-split | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 939822
num_examples: 11730
download_size: 247226
dataset_size: 939822
---
# Dataset Card for "deepfashion-multimodal-descriptions-split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jan-hq__komodo-7b-chat | ---
pretty_name: Evaluation run of jan-hq/komodo-7b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jan-hq/komodo-7b-chat](https://huggingface.co/jan-hq/komodo-7b-chat) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jan-hq__komodo-7b-chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T21:53:53.571871](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__komodo-7b-chat/blob/main/results_2024-03-24T21-53-53.571871.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4470676396888139,\n\
\ \"acc_stderr\": 0.03447744759910176,\n \"acc_norm\": 0.4529162350771818,\n\
\ \"acc_norm_stderr\": 0.03532579988060745,\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476199,\n \"mc2\": 0.4004786701634766,\n\
\ \"mc2_stderr\": 0.013918547046172366\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47013651877133106,\n \"acc_stderr\": 0.0145853058400071,\n\
\ \"acc_norm\": 0.514505119453925,\n \"acc_norm_stderr\": 0.014605241081370053\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5711013742282414,\n\
\ \"acc_stderr\": 0.004939073014754942,\n \"acc_norm\": 0.7704640509858594,\n\
\ \"acc_norm_stderr\": 0.00419674964838538\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.040403110624904356,\n\
\ \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.040403110624904356\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.44150943396226416,\n \"acc_stderr\": 0.030561590426731833,\n\
\ \"acc_norm\": 0.44150943396226416,\n \"acc_norm_stderr\": 0.030561590426731833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n\
\ \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.4236111111111111,\n\
\ \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.3583815028901734,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.030976692998534422,\n\
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.030976692998534422\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502706986,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502706986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.023266512213730564,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.023266512213730564\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924316,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.44193548387096776,\n \"acc_stderr\": 0.028251557906849745,\n \"\
acc_norm\": 0.44193548387096776,\n \"acc_norm_stderr\": 0.028251557906849745\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"\
acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5575757575757576,\n \"acc_stderr\": 0.038783721137112745,\n\
\ \"acc_norm\": 0.5575757575757576,\n \"acc_norm_stderr\": 0.038783721137112745\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5151515151515151,\n \"acc_stderr\": 0.03560716516531061,\n \"\
acc_norm\": 0.5151515151515151,\n \"acc_norm_stderr\": 0.03560716516531061\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6424870466321243,\n \"acc_stderr\": 0.034588160421810114,\n\
\ \"acc_norm\": 0.6424870466321243,\n \"acc_norm_stderr\": 0.034588160421810114\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.47692307692307695,\n \"acc_stderr\": 0.025323990861736125,\n\
\ \"acc_norm\": 0.47692307692307695,\n \"acc_norm_stderr\": 0.025323990861736125\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3697478991596639,\n \"acc_stderr\": 0.031357095996135904,\n\
\ \"acc_norm\": 0.3697478991596639,\n \"acc_norm_stderr\": 0.031357095996135904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5853211009174312,\n \"acc_stderr\": 0.021122903208602585,\n \"\
acc_norm\": 0.5853211009174312,\n \"acc_norm_stderr\": 0.021122903208602585\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.03508637358630572,\n\
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.03508637358630572\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5780590717299579,\n \"acc_stderr\": 0.032148146302403695,\n \
\ \"acc_norm\": 0.5780590717299579,\n \"acc_norm_stderr\": 0.032148146302403695\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5022421524663677,\n\
\ \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.5022421524663677,\n\
\ \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4961832061068702,\n \"acc_stderr\": 0.043851623256015534,\n\
\ \"acc_norm\": 0.4961832061068702,\n \"acc_norm_stderr\": 0.043851623256015534\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5867768595041323,\n \"acc_stderr\": 0.04495087843548408,\n \"\
acc_norm\": 0.5867768595041323,\n \"acc_norm_stderr\": 0.04495087843548408\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n\
\ \"acc_stderr\": 0.048129173245368216,\n \"acc_norm\": 0.4537037037037037,\n\
\ \"acc_norm_stderr\": 0.048129173245368216\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4539877300613497,\n \"acc_stderr\": 0.0391170190467718,\n\
\ \"acc_norm\": 0.4539877300613497,\n \"acc_norm_stderr\": 0.0391170190467718\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5048543689320388,\n \"acc_stderr\": 0.04950504382128921,\n\
\ \"acc_norm\": 0.5048543689320388,\n \"acc_norm_stderr\": 0.04950504382128921\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6452991452991453,\n\
\ \"acc_stderr\": 0.03134250486245402,\n \"acc_norm\": 0.6452991452991453,\n\
\ \"acc_norm_stderr\": 0.03134250486245402\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5836526181353767,\n\
\ \"acc_stderr\": 0.017627948030430298,\n \"acc_norm\": 0.5836526181353767,\n\
\ \"acc_norm_stderr\": 0.017627948030430298\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.02684298551961537,\n\
\ \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.02684298551961537\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925295,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925295\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02861462475280544,\n\
\ \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02861462475280544\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5627009646302251,\n\
\ \"acc_stderr\": 0.028173917761762896,\n \"acc_norm\": 0.5627009646302251,\n\
\ \"acc_norm_stderr\": 0.028173917761762896\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.027815973433878014,\n\
\ \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.027815973433878014\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02812163604063988,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02812163604063988\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3520208604954368,\n\
\ \"acc_stderr\": 0.0121981406053536,\n \"acc_norm\": 0.3520208604954368,\n\
\ \"acc_norm_stderr\": 0.0121981406053536\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5036764705882353,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.44281045751633985,\n \"acc_stderr\": 0.02009508315457735,\n \
\ \"acc_norm\": 0.44281045751633985,\n \"acc_norm_stderr\": 0.02009508315457735\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4909090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.4909090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4897959183673469,\n \"acc_stderr\": 0.03200255347893782,\n\
\ \"acc_norm\": 0.4897959183673469,\n \"acc_norm_stderr\": 0.03200255347893782\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6119402985074627,\n\
\ \"acc_stderr\": 0.03445789964362749,\n \"acc_norm\": 0.6119402985074627,\n\
\ \"acc_norm_stderr\": 0.03445789964362749\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n\
\ \"acc_stderr\": 0.03777798822748018,\n \"acc_norm\": 0.3795180722891566,\n\
\ \"acc_norm_stderr\": 0.03777798822748018\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6081871345029239,\n \"acc_stderr\": 0.03743979825926399,\n\
\ \"acc_norm\": 0.6081871345029239,\n \"acc_norm_stderr\": 0.03743979825926399\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476199,\n \"mc2\": 0.4004786701634766,\n\
\ \"mc2_stderr\": 0.013918547046172366\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.744277821625888,\n \"acc_stderr\": 0.012261253845440473\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04852160727824109,\n \
\ \"acc_stderr\": 0.00591846861892107\n }\n}\n```"
repo_url: https://huggingface.co/jan-hq/komodo-7b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|arc:challenge|25_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|gsm8k|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hellaswag|10_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T21-53-53.571871.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T21-53-53.571871.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- '**/details_harness|winogrande|5_2024-03-24T21-53-53.571871.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T21-53-53.571871.parquet'
- config_name: results
data_files:
- split: 2024_03_24T21_53_53.571871
path:
- results_2024-03-24T21-53-53.571871.parquet
- split: latest
path:
- results_2024-03-24T21-53-53.571871.parquet
---
# Dataset Card for Evaluation run of jan-hq/komodo-7b-chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jan-hq/komodo-7b-chat](https://huggingface.co/jan-hq/komodo-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jan-hq__komodo-7b-chat",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-24T21:53:53.571871](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__komodo-7b-chat/blob/main/results_2024-03-24T21-53-53.571871.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.4470676396888139,
"acc_stderr": 0.03447744759910176,
"acc_norm": 0.4529162350771818,
"acc_norm_stderr": 0.03532579988060745,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476199,
"mc2": 0.4004786701634766,
"mc2_stderr": 0.013918547046172366
},
"harness|arc:challenge|25": {
"acc": 0.47013651877133106,
"acc_stderr": 0.0145853058400071,
"acc_norm": 0.514505119453925,
"acc_norm_stderr": 0.014605241081370053
},
"harness|hellaswag|10": {
"acc": 0.5711013742282414,
"acc_stderr": 0.004939073014754942,
"acc_norm": 0.7704640509858594,
"acc_norm_stderr": 0.00419674964838538
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.040403110624904356,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.040403110624904356
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44150943396226416,
"acc_stderr": 0.030561590426731833,
"acc_norm": 0.44150943396226416,
"acc_norm_stderr": 0.030561590426731833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4236111111111111,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.4236111111111111,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.030976692998534422,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.030976692998534422
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502706986,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502706986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.023266512213730564,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.023266512213730564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924316,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.44193548387096776,
"acc_stderr": 0.028251557906849745,
"acc_norm": 0.44193548387096776,
"acc_norm_stderr": 0.028251557906849745
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5575757575757576,
"acc_stderr": 0.038783721137112745,
"acc_norm": 0.5575757575757576,
"acc_norm_stderr": 0.038783721137112745
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5151515151515151,
"acc_stderr": 0.03560716516531061,
"acc_norm": 0.5151515151515151,
"acc_norm_stderr": 0.03560716516531061
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6424870466321243,
"acc_stderr": 0.034588160421810114,
"acc_norm": 0.6424870466321243,
"acc_norm_stderr": 0.034588160421810114
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47692307692307695,
"acc_stderr": 0.025323990861736125,
"acc_norm": 0.47692307692307695,
"acc_norm_stderr": 0.025323990861736125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3697478991596639,
"acc_stderr": 0.031357095996135904,
"acc_norm": 0.3697478991596639,
"acc_norm_stderr": 0.031357095996135904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5853211009174312,
"acc_stderr": 0.021122903208602585,
"acc_norm": 0.5853211009174312,
"acc_norm_stderr": 0.021122903208602585
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.03508637358630572,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.03508637358630572
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5780590717299579,
"acc_stderr": 0.032148146302403695,
"acc_norm": 0.5780590717299579,
"acc_norm_stderr": 0.032148146302403695
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5022421524663677,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.5022421524663677,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4961832061068702,
"acc_stderr": 0.043851623256015534,
"acc_norm": 0.4961832061068702,
"acc_norm_stderr": 0.043851623256015534
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5867768595041323,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.5867768595041323,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.048129173245368216,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.048129173245368216
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4539877300613497,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.4539877300613497,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5048543689320388,
"acc_stderr": 0.04950504382128921,
"acc_norm": 0.5048543689320388,
"acc_norm_stderr": 0.04950504382128921
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6452991452991453,
"acc_stderr": 0.03134250486245402,
"acc_norm": 0.6452991452991453,
"acc_norm_stderr": 0.03134250486245402
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5836526181353767,
"acc_stderr": 0.017627948030430298,
"acc_norm": 0.5836526181353767,
"acc_norm_stderr": 0.017627948030430298
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.02684298551961537,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.02684298551961537
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925295,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925295
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.02861462475280544,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.02861462475280544
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5627009646302251,
"acc_stderr": 0.028173917761762896,
"acc_norm": 0.5627009646302251,
"acc_norm_stderr": 0.028173917761762896
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.027815973433878014,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.027815973433878014
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02812163604063988,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02812163604063988
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3520208604954368,
"acc_stderr": 0.0121981406053536,
"acc_norm": 0.3520208604954368,
"acc_norm_stderr": 0.0121981406053536
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.44281045751633985,
"acc_stderr": 0.02009508315457735,
"acc_norm": 0.44281045751633985,
"acc_norm_stderr": 0.02009508315457735
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4897959183673469,
"acc_stderr": 0.03200255347893782,
"acc_norm": 0.4897959183673469,
"acc_norm_stderr": 0.03200255347893782
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6119402985074627,
"acc_stderr": 0.03445789964362749,
"acc_norm": 0.6119402985074627,
"acc_norm_stderr": 0.03445789964362749
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.03777798822748018,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.03777798822748018
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6081871345029239,
"acc_stderr": 0.03743979825926399,
"acc_norm": 0.6081871345029239,
"acc_norm_stderr": 0.03743979825926399
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476199,
"mc2": 0.4004786701634766,
"mc2_stderr": 0.013918547046172366
},
"harness|winogrande|5": {
"acc": 0.744277821625888,
"acc_stderr": 0.012261253845440473
},
"harness|gsm8k|5": {
"acc": 0.04852160727824109,
"acc_stderr": 0.00591846861892107
}
}
```
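The per-task entries above make it easy to recompute aggregates locally, e.g. an unweighted mean over the `hendrycksTest` (MMLU) subjects. This is a sketch; the leaderboard's own aggregation may differ in exactly which tasks it averages:

```python
def average_metric(results, metric="acc", prefix="harness|hendrycksTest"):
    """Unweighted mean of `metric` over all tasks whose names start with `prefix`."""
    scores = [task[metric] for name, task in results.items()
              if name.startswith(prefix) and metric in task]
    return sum(scores) / len(scores)
```

Called on the dictionary above, this yields the MMLU-style average; the `"all"` entry is skipped automatically since it doesn't match the prefix.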
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/hata_tan_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hata_tan/はたたん (Touhou)
This is the dataset of hata_tan/はたたん (Touhou), containing 90 images and their tags.
The core tags of this character are `twintails, long_hair, hat, tokin_hat, purple_eyes, black_hair, brown_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 90 | 68.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hata_tan_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 90 | 54.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hata_tan_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 181 | 102.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hata_tan_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 90 | 66.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hata_tan_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 181 | 118.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hata_tan_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hata_tan_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, bags_under_eyes, necktie, solo, hair_ribbon |
| 1 | 12 |  |  |  |  |  | 1girl, necktie, solo, cellphone, checkered_skirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bags_under_eyes | necktie | solo | hair_ribbon | cellphone | checkered_skirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------------|:----------|:-------|:--------------|:------------|:------------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | | |
| 1 | 12 |  |  |  |  |  | X | | X | X | | X | X |
|
perceptron-743/good-reads-data | ---
dataset_info:
features:
- name: bookId
dtype: string
- name: title
dtype: string
- name: series
dtype: string
- name: author
dtype: string
- name: rating
dtype: float64
- name: description
dtype: string
- name: language
dtype: string
- name: isbn
dtype: string
- name: genres
dtype: string
- name: characters
dtype: string
- name: bookFormat
dtype: string
- name: edition
dtype: string
- name: pages
dtype: string
- name: publisher
dtype: string
- name: publishDate
dtype: string
- name: firstPublishDate
dtype: string
- name: awards
dtype: string
- name: numRatings
dtype: int64
- name: ratingsByStars
dtype: string
- name: likedPercent
dtype: float64
- name: setting
dtype: string
- name: coverImg
dtype: string
- name: bbeScore
dtype: int64
- name: bbeVotes
dtype: int64
- name: price
dtype: string
- name: word_count
dtype: int64
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 182353702
num_examples: 53944
download_size: 57947357
dataset_size: 182353702
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/79aa8f65 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 174
num_examples: 10
download_size: 1329
dataset_size: 174
---
# Dataset Card for "79aa8f65"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/yukimura_aoi_encouragementofclimb | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yukimura Aoi
This is the dataset of Yukimura Aoi, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 750 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 889 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 750 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 750 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 643 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 889 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 889 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
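The download links in the table above are relative to this repository. A small sketch for turning them into absolute URLs; the `resolve/main` pattern is assumed from sibling CyberHarem dataset repositories:

```python
REPO_ID = "CyberHarem/yukimura_aoi_encouragementofclimb"

def package_url(filename, repo_id=REPO_ID, revision="main"):
    """Absolute download URL for one of the package zips listed in the table."""
    return f"https://huggingface.co/datasets/{repo_id}/resolve/{revision}/{filename}"

# e.g. package_url("dataset-384x512.zip")
```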
|
judy93536/pharsebank_5k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 542147
num_examples: 3999
- name: test
num_bytes: 137048
num_examples: 999
download_size: 379517
dataset_size: 679195
---
# Dataset Card for "pharsebank_5k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/find_last_sent_train_10_eval_10_sentbefore | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 68765
num_examples: 50
- name: validation
num_bytes: 8980
num_examples: 10
download_size: 52757
dataset_size: 77745
---
# Dataset Card for "find_last_sent_train_10_eval_10_sentbefore"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PerceptionEval/Localization | ---
dataset_info:
features:
- name: idx
dtype: int32
- name: question
dtype: string
- name: image_1
dtype: image
- name: choices
sequence: string
- name: answer
dtype: string
- name: prompt
dtype: string
splits:
- name: val
num_bytes: 15508691.0
num_examples: 122
- name: test
num_bytes: 15908670.0
num_examples: 125
download_size: 31334083
dataset_size: 31417361.0
configs:
- config_name: default
data_files:
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
lalphass/midjourney_prompt_dataset_created_with_bard | ---
license: apache-2.0
---
|
SirNeural/flan_v2 | ---
license: apache-2.0
tags:
- flan
- flan 2022
- flan v2
pretty_name: Flan v2
---
# Dataset Card for Flan V2
## Dataset Description
- **Homepage:** https://ai.googleblog.com/2023/02/the-flan-collection-advancing-open.html
- **Repository:** https://github.com/google-research/FLAN/tree/main/flan/v2
- **Paper:** https://arxiv.org/abs/2301.13688
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This is a processed version of the Flan V2 dataset.
I'm not affiliated with the creators; I'm just releasing the files in an easier-to-access format after processing.
The authors of the Flan Collection recommend experimenting with different mixing ratios of tasks to get optimal results downstream.
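A minimal sketch of what such a weighted mixture could look like; the task names and weights below are placeholders, not the ratios the paper recommends:

```python
import random

def mix_tasks(task_examples, weights, n, seed=0):
    """Draw n examples, choosing the source task for each draw
    with probability proportional to its weight."""
    rng = random.Random(seed)
    names = list(task_examples)
    w = [weights[name] for name in names]
    return [rng.choice(task_examples[rng.choices(names, weights=w, k=1)[0]])
            for _ in range(n)]

# e.g. mix_tasks({"flan": flan_rows, "t0": t0_rows}, {"flan": 0.6, "t0": 0.4}, n=100_000)
```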
## Setup Instructions
Here are the steps I followed to get everything working:
### Build AESLC and WinoGrande datasets manually
The repos for these datasets were updated recently and checksums need to be recomputed in TFDS
- `tfds build --dataset aeslc --register_checksums`
- `tfds build --dataset winogrande --register_checksums`
### Fix dataset versions
I've opened a PR [here](https://github.com/google-research/FLAN/pull/20) to get these updated in the upstream FLAN repo; until that gets merged, run these locally to fix any dataset version errors.
- `sed -i 's/glue\/cola:1.0.0/glue\/cola:2.0.0/g' flan/v2/task_configs_v1.py`
- `sed -i 's/gem\/common_gen:1.0.0/gem\/common_gen:1.1.0/g' flan/v2/task_configs_v1.py`
- `sed -i 's/gem\/dart:1.0.0/gem\/dart:1.1.0/g' flan/v2/task_configs_v1.py`
- `sed -i 's/gem\/e2e_nlg:1.0.0/gem\/e2e_nlg:1.1.0/g' flan/v2/task_configs_v1.py`
- `sed -i 's/gem\/web_nlg_en:1.0.0/gem\/web_nlg_en:1.1.0/g' flan/v2/task_configs_v1.py`
- `sed -i 's/paws_wiki:1.0.0/paws_wiki:1.1.0/g' flan/v2/task_configs_v1.py`
- `sed -i 's/glue\/mrpc:1.0.0/glue\/mrpc:2.0.0/g' flan/v2/task_configs_v1.py`
- `sed -i 's/glue\/qqp:1.0.0/glue\/qqp:2.0.0/g' flan/v2/task_configs_v1.py`
- `sed -i 's/glue\/sst2:1.0.0/glue\/sst2:2.0.0/g' flan/v2/task_configs_v1.py`
- `sed -i 's/glue\/mnli:1.0.0/glue\/mnli:2.0.0/g' flan/v2/task_configs_v1.py`
- `sed -i 's/glue\/qnli:1.0.0/glue\/qnli:2.0.0/g' flan/v2/task_configs_v1.py`
- `sed -i 's/glue\/wnli:1.0.0/glue\/wnli:2.0.0/g' flan/v2/task_configs_v1.py`
- `sed -i 's/glue\/stsb:1.0.0/glue\/stsb:2.0.0/g' flan/v2/task_configs_v1.py`
- `sed -i 's/hellaswag:0.0.1/hellaswag:1.1.0/g' flan/v2/task_configs_v1.py`
- `sed -i 's/xsum:1.0.0/huggingface:xsum/g' flan/v2/task_configs_v1.py`
### Download and install manual steps
Save these to `~/tensorflow_datasets/downloads/manual`.
- [CzEng (deduped ignoring sections)](https://ufal.mff.cuni.cz/czeng/czeng16pre)
- [Newsroom (extract)](https://lil.nlp.cornell.edu/newsroom/download/index.html)
- [Yandex 1M Corpus](https://translate.yandex.ru/corpus?lang=en)
- [Story Cloze (extract and rename to cloze_test_test__spring2016.csv and cloze_test_val__spring2016.csv)](https://cs.rochester.edu/nlp/)
### Finally, export tasks
```python
import tensorflow as tf
tf.config.set_visible_devices([], 'GPU')
from flan.v2 import constants
from flan.v2 import constants_t0
from flan.v2 import mixtures_utils
from flan.v2 import mixtures
from flan.v2 import tasks
import json
import t5
import seqio
import itertools
from multiprocessing import Pool
seqio.add_global_cache_dirs(constants.CACHE_DIRS)
seqio.set_global_cache_dirs(constants.CACHE_DIRS)
vocab = t5.data.get_default_vocabulary()
def prepare_task(split, shots, opt, task):
dataset = seqio.get_mixture_or_task(f'palmflan_{task}_{shots}_{opt}').get_dataset(
split=split,
num_epochs=1,
sequence_length={'inputs':4096,'targets':4096}
)
print("starting", task, shots, opt, split)
with open(f'./data/{task}_{shots}_{opt}_{split}.jsonl', 'w') as f:
for ex in dataset.as_numpy_iterator():
f.write(
json.dumps({
"inputs": vocab.decode(ex["inputs"]),
"targets": vocab.decode(ex["targets"]),
"task": task,
}))
f.write("\n")
print("done with", task, shots, opt, split)
# prepare_task("train", "zs", "noopt", "dialog") # use this to export a single task
# use a distinct name so we don't shadow the imported `tasks` module
task_specs = itertools.product(["train"], ["zs", "fs"], ["opt", "noopt"], ["dialog", "t0", "niv2", "flan", "cot"])
with Pool(5) as p:
    p.starmap(prepare_task, list(task_specs))
```
## Dataset Structure
### Data Instances
The collection covers five sub-mixtures: Flan 2021 (flan), P3 (t0), Super-Natural Instructions (niv2), Chain-of-Thought (cot), and Dialog (dialog).
### Data Fields
Instruction data comes in a few formats:
- Few Shot (fs)
- Zero Shot (zs)
- Options Provided in context (i.e. multiple choice pick one) (opt)
- No Options Provided (noopt)
Each combination of the above tasks + formats is saved as a JSONL file with the following schema: `{"inputs": ..., "targets": ..., "task": ...}`
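A sketch of streaming one of the exported files back in (file naming follows the export script above):

```python
import json

def iter_examples(path):
    """Yield one example dict per non-empty line of an exported JSONL file."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)

# e.g. for ex in iter_examples("data/cot_zs_noopt_train.jsonl"): ...
```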
### Data Splits
Everything is saved as a train split.
Note: FLAN-fs-opt-train is too big to be uploaded even when gzipped, so it's split into 45 GB chunks. To combine and recover, run `cat flan_fs_opt_train_*.gz | gunzip -c > flan_fs_opt_train.jsonl`
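A pure-Python equivalent of the shell one-liner above, relying on `gzip` decoding each chunk as it is streamed into the output:

```python
import glob
import gzip
import shutil

def recombine(pattern="flan_fs_opt_train_*.gz", out_path="flan_fs_opt_train.jsonl"):
    """Decompress every chunk matching `pattern` (in name order) into one JSONL file."""
    with open(out_path, "wb") as out:
        for chunk in sorted(glob.glob(pattern)):
            with gzip.open(chunk, "rb") as gz:
                shutil.copyfileobj(gz, out)
    return out_path
```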
|
allegro/klej-cdsc-e-en | ---
license: apache-2.0
task_categories:
- text-classification
language:
- pl
- en
pretty_name: CDSC-E translated to English
size_categories:
- n<1K
---
All instances from `allegro/klej-cdsc-e` (train, val, test) were translated to English with the Google Translate API.
Columns:
- `source` - text instance in Polish.
- `target` - text instance in English. |
ai4bharat/indic-align | ---
license: cc-by-4.0
pretty_name: indic-align
language:
- as
- bn
- gu
- en
- hi
- kn
- ml
- mr
- ne
- or
- pa
- sa
- ta
- te
- ur
task_categories:
- text-generation
dataset_info:
- config_name: Indic_ShareLlama
- config_name: Dolly_T
- config_name: OpenAssistant_T
- config_name: WikiHow
- config_name: IndoWordNet
- config_name: Anudesh
- config_name: Wiki_Conv
- config_name: Wiki_Chat
- config_name: IndicAlign-Toxic
- config_name: HHRLHF_T
- config_name: Toxic_Matrix
configs:
- config_name: Indic_ShareLlama
data_files: indicalign-instruct/indicsharellama/*
- config_name: Dolly_T
data_files: indicalign-instruct/dolly/*
- config_name: OpenAssistant_T
data_files: indicalign-instruct/oasst/*
- config_name: WikiHow
data_files: indicalign-instruct/wikihow/*
- config_name: IndoWordNet
data_files: indicalign-instruct/indowordnet/*
- config_name: Anudesh
data_files: indicalign-instruct/anudesh/*
- config_name: Wiki_Conv
data_files: indicalign-instruct/wiki_conv/*
- config_name: Wiki_Chat
data_files: indicalign-instruct/wiki_chat/*
- config_name: HHRLHF_T
data_files: indicalign-toxic/hhrlhf/*
- config_name: Toxic_Matrix
data_files: indicalign-toxic/toxicmatrix/*
size_categories:
- 100M<n<1B
---
# IndicAlign
A diverse collection of Instruction and Toxic alignment datasets for 14 Indic languages. The collection comprises:
- **IndicAlign - Instruct**
- Indic-ShareLlama
- Dolly-T
- OpenAssistant-T
- WikiHow
- IndoWordNet
- Anudesh
- Wiki-Conv
- Wiki-Chat
- **IndicAlign - Toxic**
- HHRLHF-T
- Toxic-Matrix
We use IndicTrans2 ([Gala et al., 2023](https://openreview.net/forum?id=vfT4YuzAYA)) for the translation of the datasets.
We recommend that readers check out our paper [on arXiv](https://arxiv.org/abs/2403.06350) for detailed information on the curation process of these collections.
## Dataset Summaries
**IndicShareLlama**- Collection of first user prompts from [ShareGPT](https://huggingface.co/datasets/anon8231489123/ShareGPT_Vicuna_unfiltered) along with responses from [Llama2-70B-Chat](https://huggingface.co/meta-llama/Llama-2-70b-chat-hf) model.
**Dolly-T**- Translated and Romanised version of [Dolly-15K](https://huggingface.co/datasets/databricks/databricks-dolly-15k)
**OpenAssistant-T**- Translated and Romanised version of [OpenAssistant v1](https://huggingface.co/datasets/OpenAssistant/oasst1)
**WikiHow** - Translated and Romanised version of [WikiHow](https://huggingface.co/datasets/ai4bharat/indic-instruct-data-v0.1)
**IndoWordNet**- Novel dataset created by converting the entries of [IndoWordNet](https://pypi.org/project/pyiwn/) to Instruction-Response pairs in 18 Indic languages.
**Anudesh**- A crowd-sourced collection of prompts accompanied by responses generated from the Llama2-70B-Chat model.
**Wiki-Conv**- Collection of short, to-the-point conversations on Wikipedia passages and Wiki-Infoboxes created using Llama2-70B-Chat model.
**Wiki-Chat**- Collection of long, open conversations on Wikipedia passages, created by simulating conversations between a User model and an Assistant model.
**HHRLHF-T**- Collection of "toxic" prompts from [Anthropic HH-RLHF](https://huggingface.co/datasets/Anthropic/hh-rlhf) with refusals from Llama2-70B-Chat model.
**Toxic-Matrix**- A novel "synthetic" dataset with toxic prompts generated using [Mistral-7B Instruct](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1) and non-toxic responses/refusals using Llama2-70B-Chat model.
## Dataset Statistics
| Component | #Examples | Avg. Turns | Avg. Inst. Len | Avg. Out. Len |
|-------------------|-----------|------------|----------------|---------------|
| Indic ShareLlama | 21.1k | 1 | 60.45 | 267.98 |
| Dolly-T | 15.0k | 1 | 12.34 | 59.38 |
| OpenAssistant-T | 19.9k | 2.98 | 25.72 | 136.37 |
| WikiHow | 20.3k | 1 | 43.85 | 327.95 |
| IndoWordNet | 74,272.2k | 1 | 19.74 | 14.84 |
| Anudesh | 36.8k | 1.58 | 12.4 | 149.28 |
| Wiki-Conv | 144k | 9.14 | 7.09 | 11.22 |
| Wiki-Chat | 202k | 2.8 | 23 | 227.75 |
| HH-RLHF-T | 32.6k | 1 | 14.11 | 64.88 |
| Toxic Matrix | 90.3k | 1 | 33.68 | 89.64 |
## Citation
```bibtex
@misc{khan2024indicllmsuite,
title={IndicLLMSuite: A Blueprint for Creating Pre-training and Fine-Tuning Datasets for Indian Languages},
author={Mohammed Safi Ur Rahman Khan and Priyam Mehta and Ananth Sankar and Umashankar Kumaravelan and Sumanth Doddapaneni and Suriyaprasaad G and Varun Balan G and Sparsh Jain and Anoop Kunchukuttan and Pratyush Kumar and Raj Dabre and Mitesh M. Khapra},
year={2024},
eprint={2403.06350},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
ibranze/araproje_arc_en_f4 | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: validation
num_bytes: 80031.0
num_examples: 250
download_size: 0
dataset_size: 80031.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_arc_en_f4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yerkekz/top_k_dataset | ---
dataset_info:
features:
- name: note
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 3937855
num_examples: 1000
download_size: 1975731
dataset_size: 3937855
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cmu-mlsp/hubert_layer9-librispeech-asr100h_tokenized | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1337768164
num_examples: 57078
- name: validation
num_bytes: 126705828
num_examples: 5406
- name: test
num_bytes: 122815120
num_examples: 5240
download_size: 110156012
dataset_size: 1587289112
---
# Dataset Card for "hubert_layer9-librispeech-asr100h_tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
doof-ferb/vlsp2020_vinai_100h | ---
license: cc-by-4.0
task_categories:
- automatic-speech-recognition
- text-to-speech
language:
- vi
pretty_name: VLSP 2020 - VinAI - ASR challenge dataset
size_categories:
- 10K<n<100K
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 17159347574.893
num_examples: 56427
download_size: 11649243045
dataset_size: 17159347574.893
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# unofficial mirror of VLSP 2020 - VinAI - ASR challenge dataset
official announcement:
- tiếng việt: https://institute.vinbigdata.org/events/vinbigdata-chia-se-100-gio-du-lieu-tieng-noi-cho-cong-dong/
- in English: https://institute.vinbigdata.org/en/events/vinbigdata-shares-100-hour-data-for-the-community/
- VLSP 2020 workshop: https://vlsp.org.vn/vlsp2020
official download: https://drive.google.com/file/d/1vUSxdORDxk-ePUt-bUVDahpoXiqKchMx/view?usp=sharing
contact: info@vinbigdata.org
100 hours, 56.4k samples, accuracy 96%
pre-processing: merged all transcript text files into one, removed the `<unk>` token
still to do: check misspellings, restore foreign words phonetised to Vietnamese
usage with HuggingFace:
```python
# pip install -q "datasets[audio]"
from datasets import load_dataset
from torch.utils.data import DataLoader
dataset = load_dataset("doof-ferb/vlsp2020_vinai_100h", split="train", streaming=True)
dataset = dataset.with_format("torch")  # streaming (iterable) datasets use with_format, not set_format
dataloader = DataLoader(dataset, batch_size=4)
``` |
vwxyzjn/cai-conversation-prod | ---
dataset_info:
features:
- name: index
dtype: int64
- name: prompt
dtype: string
- name: init_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: init_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: critic_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: critic_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: revision_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: revision_response
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: test
num_bytes: 15206093
num_examples: 8552
- name: train
num_bytes: 283519246
num_examples: 160800
download_size: 120521045
dataset_size: 298899650
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
---
# Dataset Card for "cai-conversation-prod"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
polinaeterna/push_configs | ---
dataset_info:
- config_name: default
features:
- name: x
dtype: int64
- name: y
dtype: string
splits:
- name: train
num_bytes: 42
num_examples: 3
download_size: 1190
dataset_size: 42
- config_name: v1
features:
- name: x
dtype: int64
- name: y
dtype: string
splits:
- name: train
num_bytes: 56
num_examples: 4
download_size: 1194
dataset_size: 56
- config_name: v2
features:
- name: x
dtype: int64
- name: y
dtype: string
splits:
- name: train
num_bytes: 70
num_examples: 5
download_size: 1199
dataset_size: 70
configs_kwargs:
- config_name: default
data_dir: data
- config_name: v1
data_dir: v1
- config_name: v2
data_dir: v2
---
# Dataset Card for "push_configs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rxm210132/test | ---
dataset_info:
features:
- name: texts
dtype: string
- name: labels
sequence: float64
splits:
- name: train
num_bytes: 624448
num_examples: 3259
download_size: 255733
dataset_size: 624448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HankQIAN/slr68 | ---
license: mit
---
|
musicakamusic/piano | ---
license: gpl-3.0
---
|
liuyanchen1015/MULTI_VALUE_qqp_perfect_slam | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 233668
num_examples: 1217
- name: test
num_bytes: 2194042
num_examples: 11275
- name: train
num_bytes: 2076075
num_examples: 10705
download_size: 2693590
dataset_size: 4503785
---
# Dataset Card for "MULTI_VALUE_qqp_perfect_slam"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_shadowml__Beyonder-4x7B-v2 | ---
pretty_name: Evaluation run of mlabonne/Beyonder-4x7B-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mlabonne/Beyonder-4x7B-v2](https://huggingface.co/mlabonne/Beyonder-4x7B-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__Beyonder-4x7B-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T13:00:16.346263](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__Beyonder-4x7B-v2/blob/main/results_2024-01-04T13-00-16.346263.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6557407580878285,\n\
\ \"acc_stderr\": 0.031986495815639754,\n \"acc_norm\": 0.6553471404895377,\n\
\ \"acc_norm_stderr\": 0.03264904081955929,\n \"mc1\": 0.44430844553243576,\n\
\ \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.606846132898595,\n\
\ \"mc2_stderr\": 0.015656381105660862\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6680887372013652,\n \"acc_stderr\": 0.013760988200880541,\n\
\ \"acc_norm\": 0.6877133105802048,\n \"acc_norm_stderr\": 0.013542598541688065\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6960764787890859,\n\
\ \"acc_stderr\": 0.004590100050198816,\n \"acc_norm\": 0.8679545907189803,\n\
\ \"acc_norm_stderr\": 0.0033784824887488746\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7986111111111112,\n\
\ \"acc_stderr\": 0.03353647469713839,\n \"acc_norm\": 0.7986111111111112,\n\
\ \"acc_norm_stderr\": 0.03353647469713839\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n\
\ \"acc_stderr\": 0.02289168798455496,\n \"acc_norm\": 0.7967741935483871,\n\
\ \"acc_norm_stderr\": 0.02289168798455496\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136077,\n\
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136077\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660836,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660836\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553353,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553353\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233497,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233497\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4424581005586592,\n\
\ \"acc_stderr\": 0.016611393687268577,\n \"acc_norm\": 0.4424581005586592,\n\
\ \"acc_norm_stderr\": 0.016611393687268577\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137894,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137894\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n\
\ \"acc_stderr\": 0.025218040373410633,\n \"acc_norm\": 0.729903536977492,\n\
\ \"acc_norm_stderr\": 0.025218040373410633\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4641460234680574,\n\
\ \"acc_stderr\": 0.012737361318730583,\n \"acc_norm\": 0.4641460234680574,\n\
\ \"acc_norm_stderr\": 0.012737361318730583\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6617647058823529,\n \"acc_stderr\": 0.01913994374848704,\n \
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.01913994374848704\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44430844553243576,\n\
\ \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.606846132898595,\n\
\ \"mc2_stderr\": 0.015656381105660862\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510423\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7172100075815011,\n \
\ \"acc_stderr\": 0.012405020417873619\n }\n}\n```"
repo_url: https://huggingface.co/mlabonne/Beyonder-4x7B-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|arc:challenge|25_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|gsm8k|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hellaswag|10_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-00-16.346263.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T13-00-16.346263.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- '**/details_harness|winogrande|5_2024-01-04T13-00-16.346263.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T13-00-16.346263.parquet'
- config_name: results
data_files:
- split: 2024_01_04T13_00_16.346263
path:
- results_2024-01-04T13-00-16.346263.parquet
- split: latest
path:
- results_2024-01-04T13-00-16.346263.parquet
---
# Dataset Card for Evaluation run of mlabonne/Beyonder-4x7B-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/Beyonder-4x7B-v2](https://huggingface.co/mlabonne/Beyonder-4x7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__Beyonder-4x7B-v2",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-04T13:00:16.346263](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__Beyonder-4x7B-v2/blob/main/results_2024-01-04T13-00-16.346263.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6557407580878285,
"acc_stderr": 0.031986495815639754,
"acc_norm": 0.6553471404895377,
"acc_norm_stderr": 0.03264904081955929,
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.606846132898595,
"mc2_stderr": 0.015656381105660862
},
"harness|arc:challenge|25": {
"acc": 0.6680887372013652,
"acc_stderr": 0.013760988200880541,
"acc_norm": 0.6877133105802048,
"acc_norm_stderr": 0.013542598541688065
},
"harness|hellaswag|10": {
"acc": 0.6960764787890859,
"acc_stderr": 0.004590100050198816,
"acc_norm": 0.8679545907189803,
"acc_norm_stderr": 0.0033784824887488746
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7986111111111112,
"acc_stderr": 0.03353647469713839,
"acc_norm": 0.7986111111111112,
"acc_norm_stderr": 0.03353647469713839
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455496,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455496
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136077,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136077
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660836,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660836
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553353,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553353
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233497,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233497
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4424581005586592,
"acc_stderr": 0.016611393687268577,
"acc_norm": 0.4424581005586592,
"acc_norm_stderr": 0.016611393687268577
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137894,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.025218040373410633,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.025218040373410633
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4641460234680574,
"acc_stderr": 0.012737361318730583,
"acc_norm": 0.4641460234680574,
"acc_norm_stderr": 0.012737361318730583
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.01913994374848704,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.01913994374848704
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291296,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.606846132898595,
"mc2_stderr": 0.015656381105660862
},
"harness|winogrande|5": {
"acc": 0.8089976322020521,
"acc_stderr": 0.011047808761510423
},
"harness|gsm8k|5": {
"acc": 0.7172100075815011,
"acc_stderr": 0.012405020417873619
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
FaalSa/cluster0 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 277387012
num_examples: 7031
- name: validation
num_bytes: 280761892
num_examples: 7031
- name: test
num_bytes: 284136772
num_examples: 7031
download_size: 38383722
dataset_size: 842285676
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Kaina99/Matheus | ---
license: openrail
---
|
LeonardoTiger/seer | ---
license: openrail
---
|
Ankita802/llm | ---
dataset_info:
features:
- name: input
dtype: string
- name: result
dtype: string
splits:
- name: train
num_bytes: 1796067.5034013605
num_examples: 1999
- name: test
num_bytes: 449241.49659863947
num_examples: 500
download_size: 1138420
dataset_size: 2245309.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
SPRIGHT-T2I/100-images-for-eval | ---
dataset_info:
features:
- name: image
dtype: image
- name: spatial_caption
dtype: string
- name: subset
dtype: string
splits:
- name: train
num_bytes: 68967761
num_examples: 100
download_size: 68830514
dataset_size: 68967761
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
---
# Dataset Card for "100-images-for-eval"
100 images were selected with:
```python
import os
import shutil
import random
import json
from tqdm.auto import tqdm
random.seed(2024)
JSON_PATHS = ["cc12m/spatial_prompts_cc_res768.jsonl", "sa/spatial_prompts_sa_res768.jsonl"]
CUT_OFF_FOR_EACH = 50
SUBSET_DIR = "eval"
ROOT_PATH = "./human_eval_subset"
def copy_images(tuple_entries, subset):
final_dict = {}
for entry in tqdm(tuple_entries):
image_name = entry[0].split("/")[-1]
image_to_copy_from = os.path.join(ROOT_PATH, subset, "images", image_name)
image_to_copy_to = os.path.join(ROOT_PATH, SUBSET_DIR)
shutil.copy(image_to_copy_from, image_to_copy_to)
final_dict[image_name] = entry[1]
return final_dict
# Load the JSON files.
cc12m_entries = []
with open(JSON_PATHS[0], "rb") as json_list:
for json_str in json_list:
cc12m_entries.append(json.loads(json_str))
sa_entries = []
with open(JSON_PATHS[1], "rb") as json_list:
for json_str in json_list:
sa_entries.append(json.loads(json_str))
# Prepare tuples and shuffle them for random sampling.
print(len(cc12m_entries), len(sa_entries))
cc12m_tuples = [(line["file_name"], line["spatial_caption"]) for line in cc12m_entries]
sa_tuples = [(line["file_name"], line["spatial_caption"]) for line in sa_entries]
filtered_cc12m_tuples = [
(line[0], line[1])
for line in cc12m_tuples
if os.path.exists(os.path.join(ROOT_PATH, "cc12m", "images", line[0].split("/")[-1]))
]
# Keep paths that exist.
filtered_sa_tuples = [
(line[0], line[1])
for line in sa_tuples
if os.path.exists(os.path.join(ROOT_PATH, "sa", "images", line[0].split("/")[-1]))
]
print(len(filtered_cc12m_tuples), len(filtered_sa_tuples))
random.shuffle(filtered_cc12m_tuples)
random.shuffle(filtered_sa_tuples)
# Cut off for subsets.
subset_cc12m_tuples = filtered_cc12m_tuples[:CUT_OFF_FOR_EACH]
subset_sa_tuples = filtered_sa_tuples[:CUT_OFF_FOR_EACH]
# Copy over the images.
if not os.path.exists(SUBSET_DIR):
os.makedirs(SUBSET_DIR, exist_ok=True)
final_data_dict = {}
cc12m_dict = copy_images(subset_cc12m_tuples, "cc12m")
sa_dict = copy_images(subset_sa_tuples, "sa")
print(len(cc12m_dict), len(sa_dict))
final_data_dict = {**cc12m_dict, **sa_dict}
# Create a json file to record metadata.
with open("final_data_dict.json", "w") as f:
json.dump(final_data_dict, f)
```
After the JSON file was obtained, the dataset was created with:
```python
from datasets import Dataset, Features
from datasets import Image as ImageFeature
from datasets import Value
import json
import os
final_dict_path = "final_data_dict.json"
with open(final_dict_path, "r") as f:
final_dict = json.load(f)
root_path = "./human_eval_subset/eval"
def generation_fn():
for k in final_dict:
yield {
"image": os.path.join(root_path, k),
"spatial_caption": final_dict[k],
"subset": "SA" if "sa" in k else "CC12M",
}
ds = Dataset.from_generator(
generation_fn,
features=Features(
image=ImageFeature(),
spatial_caption=Value("string"),
subset=Value("string"),
),
)
ds_id = "ASU-HF/100-images-for-eval"
ds.push_to_hub(ds_id)
``` |
TuringsSolutions/HumorDatasetSmall | ---
license: mit
---
|
Hack90/europe_pmc_articles_part_1 | ---
dataset_info:
features:
- name: article_title
dtype: string
- name: abstract
dtype: string
- name: contributor
dtype: string
- name: journal
dtype: string
- name: full_text
dtype: string
splits:
- name: train
num_bytes: 32330800003
num_examples: 900871
download_size: 12877178347
dataset_size: 32330800003
---
# Dataset Card for "europe_pmc_articles_part_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
QuickWire/TestSet-CB | ---
license: mit
---
|
Yuhthe/Bactrian-X_vi | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: id
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 63122796
num_examples: 67017
download_size: 32712623
dataset_size: 63122796
---
# Dataset Card for "Bactrian-X_vi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Hieu-Pham/cpgQA | ---
license: mit
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-49000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1070070
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/lakhesh_sukasuka | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Lakhesh Nyx Seniorious/ラキシュ・ニクス・セニオリス (Shuumatsu Nani Shitemasu Ka? Isogashii Desu Ka?)
This is the dataset of Lakhesh Nyx Seniorious/ラキシュ・ニクス・セニオリス (Shuumatsu Nani Shitemasu Ka? Isogashii Desu Ka?), containing 69 images and their tags.
The core tags of this character are `short_hair, brown_eyes, blonde_hair, side_ponytail, hair_ribbon, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 69 | 32.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lakhesh_sukasuka/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 69 | 32.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lakhesh_sukasuka/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 117 | 54.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lakhesh_sukasuka/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lakhesh_sukasuka',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 2girls, brown_hair, pink_hair, long_hair, one_side_up, open_mouth |
| 1 | 8 |  |  |  |  |  | 1girl, closed_mouth, long_sleeves, solo, turtleneck, upper_body, blush, profile, shirt, dress, from_side, indoors, blurry_background |
| 2 | 8 |  |  |  |  |  | 1girl, brown_dress, holding_bag, long_sleeves, pinafore_dress, shirt, standing, open_mouth, solo, :o, turtleneck, door, blurry_background, one_side_up, v_arms |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 2girls | brown_hair | pink_hair | long_hair | one_side_up | open_mouth | 1girl | closed_mouth | long_sleeves | solo | turtleneck | upper_body | blush | profile | shirt | dress | from_side | indoors | blurry_background | brown_dress | holding_bag | pinafore_dress | standing | :o | door | v_arms |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------|:-------------|:------------|:------------|:--------------|:-------------|:--------|:---------------|:---------------|:-------|:-------------|:-------------|:--------|:----------|:--------|:--------|:------------|:----------|:--------------------|:--------------|:--------------|:-----------------|:-----------|:-----|:-------|:---------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 2 | 8 |  |  |  |  |  | | | | | X | X | X | | X | X | X | | | | X | | | | X | X | X | X | X | X | X | X |
|
mulderyu/MOENV_IoT_PM2.5 | ---
license: apache-2.0
---
|
ENERGY-DRINK-LOVE/msmarco_ko | ---
dataset_info:
features:
- name: query_id
dtype: string
- name: query
dtype: string
- name: positive_passages
list:
- name: docid
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: negative_passages
list:
- name: docid
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: query_ko
dtype: string
splits:
- name: train
num_bytes: 2930954584.4970384
num_examples: 359918
download_size: 1572094220
dataset_size: 2930954584.4970384
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pteromyini0/24_2_Gjoni | ---
dataset_info:
config_name: sq
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
- name: normalized_text
dtype: string
- name: audio_start_sec
dtype: float64
- name: duration
dtype: float64
splits:
- name: train
num_bytes: 3555652.0
num_examples: 24
download_size: 3363176
dataset_size: 3555652.0
configs:
- config_name: sq
data_files:
- split: train
path: sq/train-*
---
|
itamarcard/salmo | ---
license: openrail
---
|
pbaoo2705/biomedqa_processed_eval | ---
dataset_info:
features:
- name: 'Unnamed: 0.1'
dtype: int64
- name: 'Unnamed: 0'
dtype: int64
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: answer
dtype: string
- name: start_positions
dtype: int64
- name: end_positions
dtype: int64
splits:
- name: train
num_bytes: 347583
num_examples: 100
download_size: 124060
dataset_size: 347583
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "biomedqa_processed_eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pythainlp/han-instruct-dataset-v2.0 | ---
dataset_info:
features:
- name: q
dtype: string
- name: a
dtype: string
splits:
- name: train
num_bytes: 2441555
num_examples: 3200
download_size: 998771
dataset_size: 2441555
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-sa-4.0
task_categories:
- text-generation
language:
- th
size_categories:
- 1K<n<10K
---
# Dataset Card for Han Instruct Dataset v2.0
🪿 Han (ห่าน, "goose") Instruct Dataset is a Thai instruction dataset by PyThaiNLP. It collects Thai instruction data created by humans and by our earlier model, and can be used to train instruction-following models such as ChatGPT-style chatbots.
Many questions were collected from the [Reference desk at Thai wikipedia](https://th.wikipedia.org/wiki/%E0%B8%A7%E0%B8%B4%E0%B8%81%E0%B8%B4%E0%B8%9E%E0%B8%B5%E0%B9%80%E0%B8%94%E0%B8%B5%E0%B8%A2:%E0%B8%9B%E0%B8%B8%E0%B8%88%E0%B8%89%E0%B8%B2-%E0%B8%A7%E0%B8%B4%E0%B8%AA%E0%B8%B1%E0%B8%8A%E0%B8%99%E0%B8%B2).
Data sources:
- [Reference desk at Thai wikipedia](https://th.wikipedia.org/wiki/%E0%B8%A7%E0%B8%B4%E0%B8%81%E0%B8%B4%E0%B8%9E%E0%B8%B5%E0%B9%80%E0%B8%94%E0%B8%B5%E0%B8%A2:%E0%B8%9B%E0%B8%B8%E0%B8%88%E0%B8%89%E0%B8%B2-%E0%B8%A7%E0%B8%B4%E0%B8%AA%E0%B8%B1%E0%B8%8A%E0%B8%99%E0%B8%B2).
- [Law from justicechannel.org](https://justicechannel.org/)
- [pythainlp/final_training_set_v1_enth](https://huggingface.co/datasets/pythainlp/final_training_set_v1_enth): Human checked and edited.
- Self-instruct from [WangChanGLM](https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-en)
- [Wannaphong.com](https://www.wannaphong.com)
- [Blognone](https://www.blognone.com)
- Human annotators
### Supported Tasks and Leaderboards
- ChatBot
- Instruction Following
### Languages
Thai
## Dataset Structure
### Data Fields
- q: Question
- a: Answer
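Each row is a single-turn question/answer pair, so converting the dataset into chat-style training text is a one-line template. A minimal sketch in Python (the `<human>`/`<bot>` prompt format below is an illustrative assumption, not a format prescribed by the dataset):

```python
def format_example(q: str, a: str) -> str:
    """Render one Han Instruct q/a pair as a single training string.

    The <human>/<bot> template is an illustrative assumption,
    not a format prescribed by the dataset.
    """
    return f"<human>: {q}\n<bot>: {a}"

# Toy pair for illustration; real rows come from the `q` and `a` columns.
print(format_example("สวัสดี", "สวัสดีครับ"))
```

After loading with `load_dataset("pythainlp/han-instruct-dataset-v2.0")`, a function like this can be mapped over the `q` and `a` columns to build training text.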
### Considerations for Using the Data
The dataset may contain biases from its human annotators. We recommend reviewing the dataset and selecting or removing instructions as needed before training a model; use it at your own risk.
### Licensing Information
CC-BY-SA 4.0
### Citation
If you use `Han Instruct Dataset (v2.0)` in your project or publication, please cite the dataset as follows:
> Phatthiyaphaibun, W. (2024). Han Instruct Dataset (v2.0) [Data set]. Zenodo. https://doi.org/10.5281/zenodo.10935857
or
```bib
@dataset{phatthiyaphaibun_2024_10935857,
author = {Phatthiyaphaibun, Wannaphong},
title = {Han Instruct Dataset},
month = apr,
year = 2024,
publisher = {Zenodo},
version = {v2.0},
doi = {10.5281/zenodo.10935857},
url = {https://doi.org/10.5281/zenodo.10935857}
}
```
Zenodo: [https://doi.org/10.5281/zenodo.10935857](https://doi.org/10.5281/zenodo.10935857) |
Emmawang/newsdataset | ---
license: bsd
---
|
Multimodal-Fatima/VQAv2_sample_validation_facebook_opt_350m_VQAv2_visclues_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_8
num_bytes: 25491823
num_examples: 1000
download_size: 4914865
dataset_size: 25491823
---
# Dataset Card for "VQAv2_sample_validation_facebook_opt_350m_VQAv2_visclues_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mtarasovic/ner-rent-sk-dataset | ---
license: apache-2.0
---
|
eastwind/semeval-2016-absa-reviews-english-translated-resampled | ---
license: mit
---
# Dataset Card for Hotel Review ABSA (SemEval 2016 Translated from Arabic)
## Dataset Description
Derived from eastwind/semeval-2016-absa-reviews-english-translated-stanford-alpaca by upsampling the neutral class and then resampling 3k examples from each class. |
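The class-balancing step described above can be sketched in a few lines; the `label` field name and the sampling helper below are illustrative assumptions, not the card's actual pipeline (which resamples 3k per class):

```python
import random

def balance_classes(examples, per_class, seed=0):
    """Down- or up-sample each label to exactly per_class examples.

    Labels with enough data are sampled without replacement; smaller
    classes (e.g. neutral) are upsampled with replacement.
    """
    rng = random.Random(seed)
    by_label = {}
    for ex in examples:
        by_label.setdefault(ex["label"], []).append(ex)
    balanced = []
    for _, items in sorted(by_label.items()):
        if len(items) >= per_class:
            # enough data: sample without replacement
            balanced.extend(rng.sample(items, per_class))
        else:
            # minority class: upsample with replacement
            balanced.extend(rng.choices(items, k=per_class))
    return balanced
```

With `per_class=3000` this mirrors the "resampling 3k examples from each class" step.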
Sunbird/Experimental-Speech-Salt-Luganda-16k | ---
dataset_info:
features:
- name: audio
sequence:
sequence: float32
- name: sample_rate
dtype: int64
- name: transcription
dtype: string
- name: speaker_id
dtype: string
splits:
- name: train
num_bytes: 2580985002
num_examples: 7352
- name: validation
num_bytes: 140635947
num_examples: 407
- name: test
num_bytes: 147283597
num_examples: 419
download_size: 1414160353
dataset_size: 2868904546
---
# Dataset Card for "Experimental-Speech-Salt-Luganda-16k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nikchar/retrieval_verification_bm25_roberta | ---
dataset_info:
features:
- name: claim
dtype: string
- name: evidence_wiki_url
dtype: string
- name: text
dtype: string
- name: retrieved_evidence_title
sequence: string
- name: retrieved_evidence_text
sequence: string
- name: labels
dtype: int64
- name: Retrieval_Success
dtype: bool
- name: Predicted_Labels
dtype: int64
- name: Predicted_Labels_Each_doc
sequence: int64
splits:
- name: train
num_bytes: 66031496
num_examples: 11073
download_size: 30811974
dataset_size: 66031496
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "retrieval_verification_bm25_roberta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ignos__LeoScorpius-GreenNode-Alpaca-7B-v1 | ---
pretty_name: Evaluation run of ignos/LeoScorpius-GreenNode-Alpaca-7B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ignos/LeoScorpius-GreenNode-Alpaca-7B-v1](https://huggingface.co/ignos/LeoScorpius-GreenNode-Alpaca-7B-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ignos__LeoScorpius-GreenNode-Alpaca-7B-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T16:30:42.646847](https://huggingface.co/datasets/open-llm-leaderboard/details_ignos__LeoScorpius-GreenNode-Alpaca-7B-v1/blob/main/results_2023-12-16T16-30-42.646847.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6576641618199561,\n\
\ \"acc_stderr\": 0.03195573748726771,\n \"acc_norm\": 0.657374743909787,\n\
\ \"acc_norm_stderr\": 0.03261672779060913,\n \"mc1\": 0.554467564259486,\n\
\ \"mc1_stderr\": 0.017399335280140343,\n \"mc2\": 0.6935418249648612,\n\
\ \"mc2_stderr\": 0.015012402087814694\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6945392491467577,\n \"acc_stderr\": 0.013460080478002508,\n\
\ \"acc_norm\": 0.7235494880546075,\n \"acc_norm_stderr\": 0.013069662474252423\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7094204341764588,\n\
\ \"acc_stderr\": 0.0045310191594141085,\n \"acc_norm\": 0.8815972913762199,\n\
\ \"acc_norm_stderr\": 0.0032242407223513204\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\"\
: 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8532110091743119,\n \"acc_stderr\": 0.01517314184512625,\n \"\
acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.01517314184512625\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476074,\n\
\ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476074\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n\
\ \"acc_stderr\": 0.013223928616741617,\n \"acc_norm\": 0.8365261813537676,\n\
\ \"acc_norm_stderr\": 0.013223928616741617\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47262569832402235,\n\
\ \"acc_stderr\": 0.016697420650642752,\n \"acc_norm\": 0.47262569832402235,\n\
\ \"acc_norm_stderr\": 0.016697420650642752\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n\
\ \"acc_stderr\": 0.01275197796767601,\n \"acc_norm\": 0.47327249022164275,\n\
\ \"acc_norm_stderr\": 0.01275197796767601\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462937,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462937\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.554467564259486,\n\
\ \"mc1_stderr\": 0.017399335280140343,\n \"mc2\": 0.6935418249648612,\n\
\ \"mc2_stderr\": 0.015012402087814694\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.01072192328791875\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7103866565579985,\n \
\ \"acc_stderr\": 0.01249392734865963\n }\n}\n```"
repo_url: https://huggingface.co/ignos/LeoScorpius-GreenNode-Alpaca-7B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|arc:challenge|25_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|gsm8k|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hellaswag|10_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T16-30-42.646847.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T16-30-42.646847.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- '**/details_harness|winogrande|5_2023-12-16T16-30-42.646847.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T16-30-42.646847.parquet'
- config_name: results
data_files:
- split: 2023_12_16T16_30_42.646847
path:
- results_2023-12-16T16-30-42.646847.parquet
- split: latest
path:
- results_2023-12-16T16-30-42.646847.parquet
---
# Dataset Card for Evaluation run of ignos/LeoScorpius-GreenNode-Alpaca-7B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ignos/LeoScorpius-GreenNode-Alpaca-7B-v1](https://huggingface.co/ignos/LeoScorpius-GreenNode-Alpaca-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ignos__LeoScorpius-GreenNode-Alpaca-7B-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-16T16:30:42.646847](https://huggingface.co/datasets/open-llm-leaderboard/details_ignos__LeoScorpius-GreenNode-Alpaca-7B-v1/blob/main/results_2023-12-16T16-30-42.646847.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6576641618199561,
"acc_stderr": 0.03195573748726771,
"acc_norm": 0.657374743909787,
"acc_norm_stderr": 0.03261672779060913,
"mc1": 0.554467564259486,
"mc1_stderr": 0.017399335280140343,
"mc2": 0.6935418249648612,
"mc2_stderr": 0.015012402087814694
},
"harness|arc:challenge|25": {
"acc": 0.6945392491467577,
"acc_stderr": 0.013460080478002508,
"acc_norm": 0.7235494880546075,
"acc_norm_stderr": 0.013069662474252423
},
"harness|hellaswag|10": {
"acc": 0.7094204341764588,
"acc_stderr": 0.0045310191594141085,
"acc_norm": 0.8815972913762199,
"acc_norm_stderr": 0.0032242407223513204
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.01517314184512625,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.01517314184512625
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476074,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476074
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741617,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741617
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47262569832402235,
"acc_stderr": 0.016697420650642752,
"acc_norm": 0.47262569832402235,
"acc_norm_stderr": 0.016697420650642752
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.01275197796767601,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.01275197796767601
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462937,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462937
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.554467564259486,
"mc1_stderr": 0.017399335280140343,
"mc2": 0.6935418249648612,
"mc2_stderr": 0.015012402087814694
},
"harness|winogrande|5": {
"acc": 0.8232044198895028,
"acc_stderr": 0.01072192328791875
},
"harness|gsm8k|5": {
"acc": 0.7103866565579985,
"acc_stderr": 0.01249392734865963
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
hails/agieval-lsat-rc | ---
dataset_info:
features:
- name: query
dtype: string
- name: choices
sequence: string
- name: gold
sequence: int64
splits:
- name: test
num_bytes: 1136305
num_examples: 269
download_size: 322728
dataset_size: 1136305
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "agieval-lsat-rc"
Dataset taken from https://github.com/microsoft/AGIEval and processed as in that repo, following the dmayhem93/agieval-* datasets on the HF hub.
This dataset contains the contents of the LSAT reading comprehension subtask of AGIEval, as accessed in https://github.com/ruixiangcui/AGIEval/commit/5c77d073fda993f1652eaae3cf5d04cc5fd21d40.
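Per the schema above, `gold` is a sequence of zero-based indices into `choices`. A small helper (hypothetical, not part of the dataset or the AGIEval tooling) can map those indices back to the letter answers used in the original LSAT format:

```python
def gold_to_letter(gold):
    """Map zero-based gold indices (e.g. [2]) to letter answers (e.g. 'C')."""
    return "".join(chr(ord("A") + i) for i in gold)

print(gold_to_letter([2]))  # -> C
```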
Citation:
```
@misc{zhong2023agieval,
title={AGIEval: A Human-Centric Benchmark for Evaluating Foundation Models},
author={Wanjun Zhong and Ruixiang Cui and Yiduo Guo and Yaobo Liang and Shuai Lu and Yanlin Wang and Amin Saied and Weizhu Chen and Nan Duan},
year={2023},
eprint={2304.06364},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
Please make sure to cite all the individual datasets in your paper when you use them. We provide the relevant citation information below:
```
@inproceedings{ling-etal-2017-program,
title = "Program Induction by Rationale Generation: Learning to Solve and Explain Algebraic Word Problems",
author = "Ling, Wang and
Yogatama, Dani and
Dyer, Chris and
Blunsom, Phil",
booktitle = "Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2017",
address = "Vancouver, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/P17-1015",
doi = "10.18653/v1/P17-1015",
pages = "158--167",
abstract = "Solving algebraic word problems requires executing a series of arithmetic operations{---}a program{---}to obtain a final answer. However, since programs can be arbitrarily complicated, inducing them directly from question-answer pairs is a formidable challenge. To make this task more feasible, we solve these problems by generating answer rationales, sequences of natural language and human-readable mathematical expressions that derive the final answer through a series of small steps. Although rationales do not explicitly specify programs, they provide a scaffolding for their structure via intermediate milestones. To evaluate our approach, we have created a new 100,000-sample dataset of questions, answers and rationales. Experimental results show that indirect supervision of program learning via answer rationales is a promising strategy for inducing arithmetic programs.",
}
@inproceedings{hendrycksmath2021,
title={Measuring Mathematical Problem Solving With the MATH Dataset},
author={Dan Hendrycks and Collin Burns and Saurav Kadavath and Akul Arora and Steven Basart and Eric Tang and Dawn Song and Jacob Steinhardt},
journal={NeurIPS},
year={2021}
}
@inproceedings{Liu2020LogiQAAC,
title={LogiQA: A Challenge Dataset for Machine Reading Comprehension with Logical Reasoning},
author={Jian Liu and Leyang Cui and Hanmeng Liu and Dandan Huang and Yile Wang and Yue Zhang},
booktitle={International Joint Conference on Artificial Intelligence},
year={2020}
}
@inproceedings{zhong2019jec,
title={JEC-QA: A Legal-Domain Question Answering Dataset},
author={Zhong, Haoxi and Xiao, Chaojun and Tu, Cunchao and Zhang, Tianyang and Liu, Zhiyuan and Sun, Maosong},
booktitle={Proceedings of AAAI},
year={2020},
}
@article{Wang2021FromLT,
title={From LSAT: The Progress and Challenges of Complex Reasoning},
author={Siyuan Wang and Zhongkun Liu and Wanjun Zhong and Ming Zhou and Zhongyu Wei and Zhumin Chen and Nan Duan},
journal={IEEE/ACM Transactions on Audio, Speech, and Language Processing},
year={2021},
volume={30},
pages={2201-2216}
}
``` |
open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-13B-Uncensored-HF | ---
pretty_name: Evaluation run of TheBloke/Wizard-Vicuna-13B-Uncensored-HF
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Wizard-Vicuna-13B-Uncensored-HF](https://huggingface.co/TheBloke/Wizard-Vicuna-13B-Uncensored-HF)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-13B-Uncensored-HF\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T01:03:04.641003](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-13B-Uncensored-HF/blob/main/results_2023-10-23T01-03-04.641003.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.14314177852348994,\n\
\ \"em_stderr\": 0.0035865537174832513,\n \"f1\": 0.2178586409395965,\n\
\ \"f1_stderr\": 0.003730334446277459,\n \"acc\": 0.4216675951562166,\n\
\ \"acc_stderr\": 0.00989785498376742\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.14314177852348994,\n \"em_stderr\": 0.0035865537174832513,\n\
\ \"f1\": 0.2178586409395965,\n \"f1_stderr\": 0.003730334446277459\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08642911296436695,\n \
\ \"acc_stderr\": 0.0077400443371038056\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7569060773480663,\n \"acc_stderr\": 0.012055665630431032\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Wizard-Vicuna-13B-Uncensored-HF
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|arc:challenge|25_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T01_03_04.641003
path:
- '**/details_harness|drop|3_2023-10-23T01-03-04.641003.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T01-03-04.641003.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T01_03_04.641003
path:
- '**/details_harness|gsm8k|5_2023-10-23T01-03-04.641003.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T01-03-04.641003.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hellaswag|10_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T16:17:31.150663.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T16:17:31.150663.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T16:17:31.150663.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T01_03_04.641003
path:
- '**/details_harness|winogrande|5_2023-10-23T01-03-04.641003.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T01-03-04.641003.parquet'
- config_name: results
data_files:
- split: 2023_07_18T16_17_31.150663
path:
- results_2023-07-18T16:17:31.150663.parquet
- split: 2023_10_23T01_03_04.641003
path:
- results_2023-10-23T01-03-04.641003.parquet
- split: latest
path:
- results_2023-10-23T01-03-04.641003.parquet
---
# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-13B-Uncensored-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Wizard-Vicuna-13B-Uncensored-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Wizard-Vicuna-13B-Uncensored-HF](https://huggingface.co/TheBloke/Wizard-Vicuna-13B-Uncensored-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-13B-Uncensored-HF",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-23T01:03:04.641003](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-13B-Uncensored-HF/blob/main/results_2023-10-23T01-03-04.641003.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task's results can be found in its timestamped splits and in the "latest" split of the corresponding configuration):
```python
{
"all": {
"em": 0.14314177852348994,
"em_stderr": 0.0035865537174832513,
"f1": 0.2178586409395965,
"f1_stderr": 0.003730334446277459,
"acc": 0.4216675951562166,
"acc_stderr": 0.00989785498376742
},
"harness|drop|3": {
"em": 0.14314177852348994,
"em_stderr": 0.0035865537174832513,
"f1": 0.2178586409395965,
"f1_stderr": 0.003730334446277459
},
"harness|gsm8k|5": {
"acc": 0.08642911296436695,
"acc_stderr": 0.0077400443371038056
},
"harness|winogrande|5": {
"acc": 0.7569060773480663,
"acc_stderr": 0.012055665630431032
}
}
```
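As a quick sanity check, the aggregated `"all"` accuracy above is the unweighted mean of the per-task accuracies. This can be reproduced locally without downloading anything; the dict literal below is hand-copied from the JSON above and serves only as an illustration:

```python
# Per-task metrics copied from the latest results JSON above.
latest_results = {
    "harness|drop|3": {"em": 0.14314177852348994, "f1": 0.2178586409395965},
    "harness|gsm8k|5": {"acc": 0.08642911296436695},
    "harness|winogrande|5": {"acc": 0.7569060773480663},
}

# Mean over the accuracy-based tasks (gsm8k and winogrande).
accs = [m["acc"] for m in latest_results.values() if "acc" in m]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))  # → 0.4217, matching the "all" accuracy above
```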
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Nerfgun3/cute_style | ---
language:
- en
tags:
- stable-diffusion
- text-to-image
license: creativeml-openrail-m
inference: false
---
# Cute Style Embedding / Textual Inversion
## Usage
To use this embedding, download the file and drop it into the "\stable-diffusion-webui\embeddings" folder
This style doesn't really have a specific theme; it just turns the expression of girls into "cute"
To use it in a prompt: ```"drawn by cute_style"```
If it is too strong, just add [] around it.
Trained for 6000 steps
Have fun :)
## Example Pictures
<table>
<tr>
<td><img src=https://i.imgur.com/vDjSy5c.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/wXBNJNX.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/e3gremJ.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/jpYyj96.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/hUVuj9N.png width=100% height=100%/></td>
</tr>
</table>
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
reza-alipour/CelebA-HQ2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: mm_id
dtype: string
- name: image
dtype: image
- name: mask
dtype: image
- name: landmark
dtype: image
- name: landmark_cropped
dtype: image
- name: captions
sequence: string
- name: captions_eng
sequence: string
- name: captions_pes
sequence: string
- name: captions_fra
sequence: string
- name: captions_deu
sequence: string
- name: captions_ita
sequence: string
- name: captions_spa
sequence: string
- name: captions_all
sequence: string
splits:
- name: train
num_bytes: 3726486314.5
num_examples: 28495
- name: test
num_bytes: 222791982.0
num_examples: 1498
download_size: 3666582799
dataset_size: 3949278296.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/acasta_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of acasta/アカスタ/阿卡司塔 (Azur Lane)
This is the dataset of acasta/アカスタ/阿卡司塔 (Azur Lane), containing 24 images and their tags.
The core tags of this character are `black_hair, blue_eyes, bangs, breasts, short_hair, hat, multicolored_hair, blue_hair, bow, one_side_up, large_breasts, ribbon, beret, blue_bow, blunt_bangs, hair_bow, white_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 23.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acasta_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 16.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acasta_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 49 | 30.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acasta_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 21.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acasta_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 49 | 37.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acasta_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/acasta_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
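For the plain IMG+TXT packages (e.g. `dataset-800.zip`), waifuc is not required: each image ships with a same-named `.txt` tag file. A minimal sketch, assuming the archive has already been extracted into a local directory (the helper name is ours, not part of any package):

```python
import os

def load_img_txt_pairs(dataset_dir):
    """Pair each image in dataset_dir with its same-named .txt tag file."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if not os.path.exists(txt_path):
            continue  # image without a tag file; skip it
        with open(txt_path, encoding="utf-8") as f:
            # Tag files are comma-separated tag lists.
            tags = [t.strip() for t in f.read().split(",") if t.strip()]
        pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```

Each returned pair is `(image_path, [tag, ...])`, ready to feed into a training pipeline.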
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, white_shirt, black_skirt, closed_mouth, collared_shirt, long_sleeves, bag, blue_headwear, full_body, pleated_skirt, simple_background, black_choker, black_footwear, black_thighhighs, boots, chibi, coat, medium_hair, open_jacket, own_hands_together, shoes, sitting, smile, twitter_username, white_background |
| 1 | 6 |  |  |  |  |  | 1girl, blue_skirt, long_sleeves, simple_background, white_background, blush, looking_at_viewer, pleated_skirt, solo, white_thighhighs, frilled_skirt, cannon, garter_straps, high-waist_skirt, holding, loafers, machinery, medium_breasts, turret, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | solo | white_shirt | black_skirt | closed_mouth | collared_shirt | long_sleeves | bag | blue_headwear | full_body | pleated_skirt | simple_background | black_choker | black_footwear | black_thighhighs | boots | chibi | coat | medium_hair | open_jacket | own_hands_together | shoes | sitting | smile | twitter_username | white_background | blue_skirt | white_thighhighs | frilled_skirt | cannon | garter_straps | high-waist_skirt | holding | loafers | machinery | medium_breasts | turret |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:--------------|:--------------|:---------------|:-----------------|:---------------|:------|:----------------|:------------|:----------------|:--------------------|:---------------|:-----------------|:-------------------|:--------|:--------|:-------|:--------------|:--------------|:---------------------|:--------|:----------|:--------|:-------------------|:-------------------|:-------------|:-------------------|:----------------|:---------|:----------------|:-------------------|:----------|:----------|:------------|:-----------------|:---------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | | | | X | | | | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
kiringodhwani/msp6newmodel | ---
dataset_info:
features:
- name: From
sequence: string
- name: Sent
sequence: string
- name: To
sequence: string
- name: Cc
sequence: string
- name: Subject
sequence: string
- name: Attachment
sequence: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 4381100
num_examples: 3079
download_size: 1929565
dataset_size: 4381100
---
# Dataset Card for "msp6newmodel"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vallllll321/MyVoice | ---
license: openrail
---
|
vkaradeniz/moneypay_sss_2 | ---
dataset_info:
features:
- name: input
dtype: int64
- name: instruction
dtype: string
- name: output
dtype: string
- name: data_source
dtype: string
splits:
- name: train
num_bytes: 24455
num_examples: 74
download_size: 16040
dataset_size: 24455
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-slerp | ---
pretty_name: Evaluation run of mlabonne/NeuralPipe-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mlabonne/NeuralPipe-7B-slerp](https://huggingface.co/mlabonne/NeuralPipe-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T12:33:01.505276](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-slerp/blob/main/results_2024-01-05T12-33-01.505276.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6445269708058093,\n\
\ \"acc_stderr\": 0.03218714474134609,\n \"acc_norm\": 0.6449418405596148,\n\
\ \"acc_norm_stderr\": 0.03284511879516387,\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.598408044881861,\n\
\ \"mc2_stderr\": 0.015149948573522944\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6476109215017065,\n \"acc_stderr\": 0.013960142600598675,\n\
\ \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.013678810399518829\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6701852220673172,\n\
\ \"acc_stderr\": 0.0046918486653990685,\n \"acc_norm\": 0.8616809400517825,\n\
\ \"acc_norm_stderr\": 0.003445289925011734\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n\
\ \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n\
\ \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n\
\ \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n\
\ \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n \
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"\
acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8550458715596331,\n \"acc_stderr\": 0.01509421569970048,\n \"\
acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.01509421569970048\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.0257449025322909,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.0257449025322909\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.013265346261323793,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.013265346261323793\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n\
\ \"acc_stderr\": 0.016083749986853697,\n \"acc_norm\": 0.36312849162011174,\n\
\ \"acc_norm_stderr\": 0.016083749986853697\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015058,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015058\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.598408044881861,\n\
\ \"mc2_stderr\": 0.015149948573522944\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.01120186274448705\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6823351023502654,\n \
\ \"acc_stderr\": 0.012824066621488845\n }\n}\n```"
repo_url: https://huggingface.co/mlabonne/NeuralPipe-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|arc:challenge|25_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|arc:challenge|25_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|gsm8k|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|gsm8k|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hellaswag|10_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hellaswag|10_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T17-44-55.770154.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T12-33-01.505276.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T12-33-01.505276.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- '**/details_harness|winogrande|5_2023-12-29T17-44-55.770154.parquet'
- split: 2024_01_05T12_33_01.505276
path:
- '**/details_harness|winogrande|5_2024-01-05T12-33-01.505276.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T12-33-01.505276.parquet'
- config_name: results
data_files:
- split: 2023_12_29T17_44_55.770154
path:
- results_2023-12-29T17-44-55.770154.parquet
- split: 2024_01_05T12_33_01.505276
path:
- results_2024-01-05T12-33-01.505276.parquet
- split: latest
path:
- results_2024-01-05T12-33-01.505276.parquet
---
# Dataset Card for Evaluation run of mlabonne/NeuralPipe-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/NeuralPipe-7B-slerp](https://huggingface.co/mlabonne/NeuralPipe-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-05T12:33:01.505276](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-slerp/blob/main/results_2024-01-05T12-33-01.505276.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6445269708058093,
"acc_stderr": 0.03218714474134609,
"acc_norm": 0.6449418405596148,
"acc_norm_stderr": 0.03284511879516387,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.598408044881861,
"mc2_stderr": 0.015149948573522944
},
"harness|arc:challenge|25": {
"acc": 0.6476109215017065,
"acc_stderr": 0.013960142600598675,
"acc_norm": 0.6757679180887372,
"acc_norm_stderr": 0.013678810399518829
},
"harness|hellaswag|10": {
"acc": 0.6701852220673172,
"acc_stderr": 0.0046918486653990685,
"acc_norm": 0.8616809400517825,
"acc_norm_stderr": 0.003445289925011734
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.01509421569970048,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.01509421569970048
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.0257449025322909,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.0257449025322909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323793,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323793
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36312849162011174,
"acc_stderr": 0.016083749986853697,
"acc_norm": 0.36312849162011174,
"acc_norm_stderr": 0.016083749986853697
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015058,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015058
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.598408044881861,
"mc2_stderr": 0.015149948573522944
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.01120186274448705
},
"harness|gsm8k|5": {
"acc": 0.6823351023502654,
"acc_stderr": 0.012824066621488845
}
}
```
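As a sketch of how these per-task numbers can be aggregated, the mean MMLU (`hendrycksTest`) accuracy follows from filtering on the task-name prefix. The dict below is a hypothetical three-entry slice of the JSON above, not the full results:

```python
# Sketch: average the hendrycksTest (MMLU) accuracies from a results dict
# shaped like the JSON above. Only three entries are reproduced here; the
# real dict has one entry per hendrycksTest task.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6074074074074074},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7039473684210527},
}

mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_mean, 4))  # 0.5371 for these three tasks
```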
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
TagsTest2024/tiny_llava_2 | ---
dataset_info:
features:
- name: src
dtype: string
- name: prompt
dtype: string
- name: tiny_llava
dtype: string
splits:
- name: anime_sfw_5000_test1
num_bytes: 7430026
num_examples: 5000
download_size: 3774630
dataset_size: 7430026
configs:
- config_name: default
data_files:
- split: anime_sfw_5000_test1
path: data/anime_sfw_5000_test1-*
---
|
chenghao/cuad_qa | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- closed-domain-qa
- extractive-qa
paperswithcode_id: cuad
pretty_name: CUAD
train-eval-index:
- config: default
task: question-answering
task_id: extractive_question_answering
splits:
train_split: train
eval_split: test
col_mapping:
question: question
context: context
answers:
text: text
answer_start: answer_start
metrics:
- type: cuad
name: CUAD
---
# Dataset Card for CUAD
This is a modified version of the original [CUAD](https://huggingface.co/datasets/cuad/blob/main/README.md) that trims each question down to its label form.
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Contract Understanding Atticus Dataset](https://www.atticusprojectai.org/cuad)
- **Repository:** [Contract Understanding Atticus Dataset](https://github.com/TheAtticusProject/cuad/)
- **Paper:** [CUAD: An Expert-Annotated NLP Dataset for Legal Contract Review](https://arxiv.org/abs/2103.06268)
- **Point of Contact:** [Atticus Project Team](info@atticusprojectai.org)
### Dataset Summary
Contract Understanding Atticus Dataset (CUAD) v1 is a corpus of more than 13,000 labels in 510 commercial legal contracts that have been manually labeled to identify 41 categories of important clauses that lawyers look for when reviewing contracts in connection with corporate transactions.
CUAD is curated and maintained by The Atticus Project, Inc. to support NLP research and development in legal contract review. Analysis of CUAD can be found at https://arxiv.org/abs/2103.06268. Code for replicating the results and the trained model can be found at https://github.com/TheAtticusProject/cuad.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The dataset contains samples in English only.
## Dataset Structure
### Data Instances
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"answers": {
"answer_start": [44],
"text": ['DISTRIBUTOR AGREEMENT']
},
"context": 'EXHIBIT 10.6\n\n DISTRIBUTOR AGREEMENT\n\n THIS DISTRIBUTOR AGREEMENT (the "Agreement") is made by and between Electric City Corp., a Delaware corporation ("Company") and Electric City of Illinois LLC ("Distributor") this 7th day of September, 1999...',
"id": "LIMEENERGYCO_09_09_1999-EX-10-DISTRIBUTOR AGREEMENT__Document Name_0",
"question": "Highlight the parts (if any) of this contract related to "Document Name" that should be reviewed by a lawyer. Details: The name of the contract",
"title": "LIMEENERGYCO_09_09_1999-EX-10-DISTRIBUTOR AGREEMENT"
}
```
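The `answer_start` values are character offsets into `context`, SQuAD-style, so the gold span can be recovered by slicing. A minimal sketch, using a shortened, hypothetical record shaped like the example above:

```python
# Sketch: recover the gold answer span from a record shaped like the example
# above. The record here is shortened and hypothetical; answer_start is a
# character index into `context`, as in SQuAD.
record = {
    "context": "EXHIBIT 10.6\n\n" + " " * 30 + "DISTRIBUTOR AGREEMENT",
    "answers": {"answer_start": [44], "text": ["DISTRIBUTOR AGREEMENT"]},
}

start = record["answers"]["answer_start"][0]
gold = record["answers"]["text"][0]
span = record["context"][start : start + len(gold)]
assert span == gold  # the offset points exactly at the labeled text
print(span)
```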
### Data Fields
- `id`: a `string` feature.
- `title`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: a `int32` feature.
### Data Splits
This dataset is split into train and test sets. The number of samples in each set is given below:
| | Train | Test |
| ----- | ------ | ---- |
| CUAD | 22450 | 4182 |
## Dataset Creation
### Curation Rationale
A highly valuable specialized task without a public large-scale dataset is contract review, which costs humans substantial time, money, and attention. Many law firms spend approximately 50% of their time reviewing contracts (CEB, 2017). Due to the specialized training necessary to understand and interpret contracts, the billing rates for lawyers at large law firms are typically around $500-$900 per hour in the US. As a result, many transactions cost companies hundreds of thousands of dollars just so that lawyers can verify that there are no problematic obligations or requirements included in the contracts. Contract review can be a source of drudgery and, in comparison to other legal tasks, is widely considered to be especially boring.
Contract review costs also affect consumers. Since contract review costs are so prohibitive, contract review is not often performed outside corporate transactions. Small companies and individuals consequently often sign contracts without even reading them, which can result in predatory behavior that harms consumers. Automating contract review by openly releasing high-quality data and fine-tuned models can increase access to legal support for small businesses and individuals, so that legal support is not exclusively available to wealthy companies.
To reduce the disparate societal costs of contract review, and to study how well NLP models generalize to specialized domains, the authors introduced a new large-scale dataset for contract review. As part of The Atticus Project, a non-profit organization of legal experts, they created CUAD, the Contract Understanding Atticus Dataset, through a year-long effort by dozens of law student annotators, lawyers, and machine learning researchers. The dataset includes more than 500 contracts and more than 13,000 expert annotations that span 41 label categories. For each of the 41 labels, models must learn to highlight the portions of a contract most salient to that label. This makes the task a matter of finding needles in a haystack.
### Source Data
#### Initial Data Collection and Normalization
CUAD includes commercial contracts selected from 25 different contract types, based on the contract names shown below. Within each type, the creators randomly selected contracts based on the names of the filing companies across the alphabet.
| Type of Contract | # of Docs |
| ---------------- | --------- |
| Affiliate Agreement | 10 |
| Agency Agreement | 13 |
| Collaboration/Cooperation Agreement | 26 |
| Co-Branding Agreement | 22 |
| Consulting Agreement | 11 |
| Development Agreement | 29 |
| Distributor Agreement | 32 |
| Endorsement Agreement | 24 |
| Franchise Agreement | 15 |
| Hosting Agreement | 20 |
| IP Agreement | 17 |
| Joint Venture Agreement | 23 |
| License Agreement | 33 |
| Maintenance Agreement | 34 |
| Manufacturing Agreement | 17 |
| Marketing Agreement | 17 |
| Non-Compete/No-Solicit/Non-Disparagement Agreement | 3 |
| Outsourcing Agreement | 18 |
| Promotion Agreement | 12 |
| Reseller Agreement | 12 |
| Service Agreement | 28 |
| Sponsorship Agreement | 31 |
| Supply Agreement | 18 |
| Strategic Alliance Agreement | 32 |
| Transportation Agreement | 13 |
| **Total** | **510** |
#### Who are the source language producers?
The contracts were sourced from EDGAR, the Electronic Data Gathering, Analysis, and Retrieval system used at the U.S. Securities and Exchange Commission (SEC). Publicly traded companies in the United States are required to file certain contracts under the SEC rules. Access to these contracts is available to the public for free at https://www.sec.gov/edgar. Please read the Datasheet at https://www.atticusprojectai.org/ for information on the intended use and limitations of the CUAD.
### Annotations
#### Annotation process
The labeling process included multiple steps to ensure accuracy:
1. Law Student Training: law students attended training sessions on each of the categories that included a summary, video instructions by experienced attorneys, multiple quizzes and workshops. Students were then required to label sample contracts in eBrevia, an online contract review tool. The initial training took approximately 70-100 hours.
2. Law Student Label: law students conducted manual contract review and labeling in eBrevia.
3. Keyword Search: law students conducted keyword searches in eBrevia to capture additional categories that had been missed during the “Student Label” step.
4. Category-by-Category Report Review: law students exported the labeled clauses into reports, reviewed each clause category by category, and highlighted clauses that they believed were mislabeled.
5. Attorney Review: experienced attorneys reviewed the category-by-category reports with the students' comments, provided their own comments, and addressed student questions. When applicable, attorneys discussed the results with the students and reached consensus, and students made the corresponding changes in eBrevia.
6. eBrevia Extras Review: attorneys and students used eBrevia to generate a list of “extras”, clauses that the eBrevia AI tool identified as responsive to a category but that had not been labeled by human annotators. Attorneys and students reviewed all of the “extras” and added the correct ones. The process was repeated until all or substantially all of the remaining “extras” were incorrect labels.
7. Final Report: The final report was exported into a CSV file. Volunteers manually added the “Yes/No” answer column to categories that do not contain an answer.
#### Who are the annotators?
See the annotation process described above.
### Personal and Sensitive Information
Some clauses in the files are redacted because the party submitting these contracts redacted them to protect confidentiality. Such redaction may show up as asterisks (\*\*\*) or underscores (\_\_\_) or blank spaces. The dataset and the answers reflect such redactions. For example, the answer for “January \_\_ 2020” would be “1/[]/2020”.
For any categories that require an answer of “Yes/No”, annotators include full sentences as text context in a contract. To maintain consistency and minimize inter-annotator disagreement, annotators select text for the full sentence, under the instruction of “from period to period”.
For the other categories, annotators selected segments of the text in the contract that are responsive to each such category. One category in a contract may include multiple labels. For example, “Parties” may include 4-10 separate text strings that are not continuous in a contract. The answer is presented in the unified format separated by semicolons of “Party A Inc. (“Party A”); Party B Corp. (“Party B”)”.
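As a sketch, the unified semicolon-separated format can be split back into individual party strings (the answer string below is the hypothetical example from the paragraph above; real answers may need more careful parsing):

```python
# Sketch: split the unified semicolon-separated answer format back into
# individual party strings. A bare split on ";" is enough for this example,
# but would also break on any semicolon inside a party name.
answer = 'Party A Inc. ("Party A"); Party B Corp. ("Party B")'
parties = [p.strip() for p in answer.split(";")]
print(parties)
```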
Some sentences in the files include confidential legends that are not part of the contracts. An example of such confidential legend is as follows:
THIS EXHIBIT HAS BEEN REDACTED AND IS THE SUBJECT OF A CONFIDENTIAL TREATMENT REQUEST. REDACTED MATERIAL IS MARKED WITH [* * *] AND HAS BEEN FILED SEPARATELY WITH THE SECURITIES AND EXCHANGE COMMISSION.
Some sentences in the files contain irrelevant information such as footers or page numbers. Some sentences may not be relevant to the corresponding category. Some sentences may correspond to a different category. Because many legal clauses are very long and contain various sub-parts, sometimes only a sub-part of a sentence is responsive to a category.
To address the foregoing limitations, annotators manually deleted the portion that is not responsive, replacing it with the symbol "<omitted>" to indicate that the two text segments do not appear immediately next to each other in the contracts. For example, if a “Termination for Convenience” clause starts with “Each Party may terminate this Agreement if” followed by three subparts “(a), (b) and (c)”, but only subpart (c) is responsive to this category, the authors manually deleted subparts (a) and (b) and replaced them with the symbol "<omitted>”. Another example is for “Effective Date”, the contract includes a sentence “This Agreement is effective as of the date written above” that appears after the date “January 1, 2010”. The annotation is as follows: “January 1, 2010 <omitted> This Agreement is effective as of the date written above.”
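Downstream code can recover the individual text segments of such an annotation by splitting on the marker (a minimal sketch; this helper is illustrative and not part of the official dataset tooling):

```python
# Split an annotation on the "<omitted>" marker to recover the
# non-contiguous text segments as they appear in the contract.
annotation = ("January 1, 2010 <omitted> This Agreement is effective "
              "as of the date written above.")
segments = [part.strip() for part in annotation.split("<omitted>")]
# Each element of segments is a contiguous span from the contract text.
```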
Because the contracts were converted from PDF into TXT files, the converted TXT files may not stay true to the format of the original PDF files. For example, some contracts contain inconsistent spacing between words, sentences and paragraphs. Table format is not maintained in the TXT files.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Attorney Advisors
Wei Chen, John Brockland, Kevin Chen, Jacky Fink, Spencer P. Goodson, Justin Haan, Alex Haskell, Kari Krusmark, Jenny Lin, Jonas Marson, Benjamin Petersen, Alexander Kwonji Rosenberg, William R. Sawyers, Brittany Schmeltz, Max Scott, Zhu Zhu
Law Student Leaders
John Batoha, Daisy Beckner, Lovina Consunji, Gina Diaz, Chris Gronseth, Calvin Hannagan, Joseph Kroon, Sheetal Sharma Saran
Law Student Contributors
Scott Aronin, Bryan Burgoon, Jigar Desai, Imani Haynes, Jeongsoo Kim, Margaret Lynch, Allison Melville, Felix Mendez-Burgos, Nicole Mirkazemi, David Myers, Emily Rissberger, Behrang Seraj, Sarahginy Valcin
Technical Advisors & Contributors
Dan Hendrycks, Collin Burns, Spencer Ball, Anya Chen
### Licensing Information
CUAD is licensed under the Creative Commons Attribution 4.0 (CC BY 4.0) license and free to the public for commercial and non-commercial use.
The creators make no representations or warranties regarding the license status of the underlying contracts, which are publicly available and downloadable from EDGAR.
Privacy Policy & Disclaimers
The categories or the contracts included in the dataset are not comprehensive or representative. The authors encourage the public to help improve them by sending them your comments and suggestions to info@atticusprojectai.org. Comments and suggestions will be reviewed by The Atticus Project at its discretion and will be included in future versions of Atticus categories once approved.
The use of CUAD is subject to their privacy policy https://www.atticusprojectai.org/privacy-policy and disclaimer https://www.atticusprojectai.org/disclaimer.
### Citation Information
```
@article{hendrycks2021cuad,
title={CUAD: An Expert-Annotated NLP Dataset for Legal Contract Review},
author={Dan Hendrycks and Collin Burns and Anya Chen and Spencer Ball},
journal={arXiv preprint arXiv:2103.06268},
year={2021}
}
```
### Contributions
Thanks to [@bhavitvyamalik](https://github.com/bhavitvyamalik) for adding the original CUAD dataset. |
CHO0522/customehkcode2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 12241
num_examples: 5
download_size: 18770
dataset_size: 12241
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gaizerick/nasalome | ---
license: openrail
---
|
vicaloy/test-raft-theoric | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 38341313
num_examples: 37001
- name: test
num_bytes: 18645038
num_examples: 13059
download_size: 13244489
dataset_size: 56986351
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
erfanvaredi/sql-create-context-ins | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 17752821
num_examples: 78577
download_size: 6800583
dataset_size: 17752821
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Hello. This is a forked version of [b-mc2/sql-create-context](https://huggingface.co/datasets/b-mc2/sql-create-context).
In this repo, instead of a separate context column, all context text has been merged into the question.
You can use this dataset to finetune your model for `text2sql` tasks.
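The transformation described above can be reproduced from the original dataset with something like the following (a sketch; the exact separator this fork uses between context and question is an assumption):

```python
# Merge the CREATE TABLE context into the question, mirroring what
# this fork does to the original b-mc2/sql-create-context records.
def merge_context(question: str, context: str) -> str:
    # Assumption: context precedes the question, separated by a newline.
    return f"{context}\n{question}"

merged = merge_context(
    "How many heads of the departments are older than 56?",
    "CREATE TABLE head (age INTEGER)",
)
```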
## Use with Hugging Face `datasets`
1. First, install the libraries:
```sh
pip install -Uq datasets huggingface_hub
```
2. Load the dataset:
```py
from datasets import load_dataset
ds = load_dataset('erfanvaredi/sql-create-context-ins')
ds
```
Output:
```python
DatasetDict({
train: Dataset({
features: ['question', 'answer'],
num_rows: 78577
})
})
```
## LLaMA-Factory
It can be easily used by LLaMA-Factory as well.
To use this dataset in LLaMA-Factory, go through the steps below.
1. Clone LLaMA-Factory
2. Go to `LLaMA-Factory/data/` and open `dataset_info.json`
3. Add below json:
```json
"text2sql": {
"hf_hub_url": "erfanvaredi/sql-create-context-ins",
"columns": {
"prompt": "question",
"response": "answer"
}
}
``` |
bigscience-data/roots_en_wikibooks | ---
language: en
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_en_wikibooks
# wikibooks_filtered
- Dataset uid: `wikibooks_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.0897 % of total
- 0.2591 % of en
- 0.0965 % of fr
- 0.1691 % of es
- 0.2834 % of indic-hi
- 0.2172 % of pt
- 0.0149 % of zh
- 0.0279 % of ar
- 0.1374 % of vi
- 0.5025 % of id
- 0.3694 % of indic-ur
- 0.5744 % of eu
- 0.0769 % of ca
- 0.0519 % of indic-ta
- 0.1470 % of indic-mr
- 0.0751 % of indic-te
- 0.0156 % of indic-bn
- 0.0476 % of indic-ml
- 0.0087 % of indic-pa
### BigScience processing steps
#### Filters applied to: en
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_en
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_fr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_es
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-hi
- dedup_template_soft
- filter_small_docs_bytes_300
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_pt
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: zh
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_zhs
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: ar
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ar
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: vi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_vi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: id
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_id
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ur
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_eu
- dedup_template_soft
- replace_newline_with_space
#### Filters applied to: ca
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ca
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ta
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-mr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-te
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-bn
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ml
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-pa
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-pa
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
|
H34lthy/Isotropy | ---
license: mit
---
|
OdiaGenAI/odia_master_data_llama2 | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-generation
language:
- or
pretty_name: odia_master_data_llama2
size_categories:
- 100K<n<1M
---
# Dataset Card for odia_master_data_llama2
## Dataset Description
- **Homepage:** https://www.odiagenai.org/
- **Repository:** https://github.com/shantipriyap/OdiaGenAI
- **Point of Contact:** Shantipriya Parida and Sambit Sekhar
### Dataset Summary
This dataset is a mix of Odia instruction sets translated from open-source instruction sets and Odia domain knowledge instruction sets.
The Odia instruction sets used are:
* odia_domain_context_train_v1
* dolly-odia-15k
* OdiEnCorp_translation_instructions_25k
* gpt-teacher-roleplay-odia-3k
* Odia_Alpaca_instructions_52k
* hardcode_odia_qa_105
In this dataset, Odia instruction, input, and output strings are available.
### Supported Tasks and Leaderboards
Large Language Model (LLM)
### Languages
Odia
## Dataset Structure
JSON
### Data Fields
output (string)
instruction (string)
input (string)
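An instruction/input/output record can be rendered into a single training prompt, e.g. with an Alpaca-style template (a sketch; the template wording and the example strings are illustrative, not part of the dataset):

```python
# Render one record into a single prompt using an Alpaca-style template.
def format_record(instruction: str, input_text: str, output: str) -> str:
    if input_text:
        return (f"### Instruction:\n{instruction}\n\n"
                f"### Input:\n{input_text}\n\n"
                f"### Response:\n{output}")
    # Records with an empty input drop the Input section entirely.
    return f"### Instruction:\n{instruction}\n\n### Response:\n{output}"

prompt = format_record("Translate the sentence to Odia.", "Hello", "ନମସ୍କାର")
```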
### Licensing Information
This work is licensed under a
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]
[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/
[cc-by-nc-sa-image]: https://licensebuttons.net/l/by-nc-sa/4.0/88x31.png
[cc-by-nc-sa-shield]: https://img.shields.io/badge/License-CC%20BY--NC--SA%204.0-lightgrey.svg
### Citation Information
If you find this repository useful, please consider giving 👏 and citing:
```
@misc{odia_master_data_llama2,
author = {Shantipriya Parida and Sambit Sekhar and Aisha Asif and Subham Pradhan and Guneet Singh Kohli and Swateek Jena},
title = {Large Odia Instruction Set for LlaMA2 Finetuning},
year = {2023},
publisher = {Hugging Face},
journal = {Hugging Face repository},
howpublished = {\url{https://huggingface.co/OdiaGenAI}},
}
```
### Contributions
- Shantipriya Parida (Silo AI, Helsinki, Finland)
- Sambit Sekhar (Odia Generative AI, Bhubaneswar, India)
- Aisha Asif (KIIT University, Bhubaneswar, India)
- Subham Pradhan (Silicon Institute of Technology, Bhubaneswar, India)
- Guneet Singh Kohli (Thapar Institute of Engineering and Technology, India)
- Swateek Jena (RightSense Inc, USA)
|
DynamicSuperb/AutomaticSpeechRecognition_LibriSpeech-TestOther | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: label
dtype: string
- name: instruction
dtype: string
splits:
- name: test
num_bytes: 352426584.188
num_examples: 2939
download_size: 332888539
dataset_size: 352426584.188
---
# Dataset Card for "AutomaticSpeechRecognition_LibriSpeech-TestOther"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
belloIsMiaoMa/img-1Hmeow-32hop | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 9197000.348
num_examples: 3009
download_size: 9772858
dataset_size: 9197000.348
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_mnli_drop_aux_have | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 228312
num_examples: 983
- name: dev_mismatched
num_bytes: 304679
num_examples: 1230
- name: test_matched
num_bytes: 234389
num_examples: 970
- name: test_mismatched
num_bytes: 302132
num_examples: 1240
- name: train
num_bytes: 9212551
num_examples: 39238
download_size: 6455628
dataset_size: 10282063
---
# Dataset Card for "MULTI_VALUE_mnli_drop_aux_have"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bananabot/TrumpSpeeches | ---
license: wtfpl
---
|
Mihaj/ruoh_demo | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: mother_tongue
dtype: string
- name: region
dtype: string
- name: gender
dtype: string
- name: age
dtype: int64
splits:
- name: train
num_bytes: 1600232223.61
num_examples: 13198
- name: test
num_bytes: 405584868.6
num_examples: 3300
download_size: 1960524339
dataset_size: 2005817092.21
---
# Dataset Card for "ruoh_demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UtkuC/test | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 41868.0
num_examples: 7
download_size: 42992
dataset_size: 41868.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/a6b86698 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1341
dataset_size: 182
---
# Dataset Card for "a6b86698"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bjoernp/mistral_captions | ---
license: apache-2.0
configs:
- config_name: default
data_files: captions.jsonl
---
|
yiyic/mtg_de | ---
language:
- de
--- |
ellenlnt/ASCOR_audio2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype: audio
- name: file_name
dtype: string
- name: ID
dtype: string
- name: title
dtype: string
- name: lang
dtype: string
splits:
- name: train
num_bytes: 1482575793.0
num_examples: 250
download_size: 1465127520
dataset_size: 1482575793.0
---
# Dataset Card for "ASCOR_audio2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
avduarte333/BookTection | ---
license: mit
task_categories:
- question-answering
language:
- en
---
# 📚 BookTection Dataset
The BookTection dataset serves as a benchmark for the task of detecting pretraining data in Large Language Models.
The dataset consists of 165 books.
- 60 published in 2023: Non-Training data, "_label_" column = 0.
- 105 published before 2022: Training data, "_label_" column = 1.
From each book, ≈34 passages are extracted. Each passage is paraphrased 3 times using the language model Claude v2.0. <br>
The "_Answer_" column indicates which of the passages is the real excerpt.<br>
Passages come in 3 different sizes (small, medium and large) which aim to be respectively ≈(64, 128 and 256) tokens in length.
<br>
# 🧪 Testing Models on BookTection
Our dataset is designed to be used in a Multiple-Choice-Question-Answering format. Nonetheless, it is also compatible with other pretraining data detection methods.<br>
Our [GitHub](https://github.com/avduarte333/DE-COP_Method) repository contains example scripts to evaluate models on our dataset.
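The multiple-choice evaluation ultimately reduces to comparing the model's picks against the "_Answer_" column; a minimal scoring sketch (the letter encoding of the options below is an assumption for illustration):

```python
# Score a multiple-choice run: each item has 4 passages (1 real excerpt
# + 3 paraphrases), and the answers list marks the real one per item.
def mcqa_accuracy(predictions, answers):
    correct = sum(p == a for p, a in zip(predictions, answers))
    return correct / len(answers)

acc = mcqa_accuracy(["A", "C", "B", "D"], ["A", "C", "D", "D"])
```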
<br>
# 🤝 Compatibility
The Multiple-Choice-Question-Answering task with our Dataset is designed to be applied to various models, such as:<br>
- LLaMA-2
- Mistral
- Mixtral
- Chat-GPT (gpt-3.5-turbo-instruct)
- GPT-3 (text-davinci-003)
- Claude
<br>
# 🔧 Loading the Dataset
```python
from datasets import load_dataset
dataset = load_dataset("avduarte333/BookTection")
```
<br>
# 💬 Citation
```bibtex
@misc{duarte2024decop,
title={{DE-COP: Detecting Copyrighted Content in Language Models Training Data}},
author={André V. Duarte and Xuandong Zhao and Arlindo L. Oliveira and Lei Li},
year={2024},
eprint={2402.09910},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<details>
<summary> 📖 Book References</summary>
[1] Orwell, G. (2021). Nineteen Eighty-Four. Penguin Classics.<br>
[2] Martin, George R. R., author. (1996). A game of thrones. New York :Bantam Books,<br>
[3] Joyce, J. (1992). A portrait of the artist as a young man. Wordsworth Editions.<br>
[4] Dickens, C. (2012). A tale of two cities. Penguin Classics.<br>
[5] Twain, M. (2010). The adventures of huckleberry Finn. William Collins.<br>
[6] Carroll, L. (2015). Alice’s adventures in wonderland (M. Burstein, Ed.). Princeton University Press.<br>
[7] Doerr, A. (2015). All the light we cannot see. Fourth Estate.<br>
[8] Christie, A. (2003). And then there were none. HarperCollins.<br>
[9] Brown, D. (2000). Angels & demons. 1st large print ed. New York, Random House Large Print.<br>
[10] Montgomery, L. M. (2018). Anne of Green Gables. Wordsworth Editions.<br>
[11] Melville, H. (2004). Bartleby The Scrivener. Melville House Publishing.<br>
[12] Sewell, A. (2018). Black Beauty. Wordsworth Editions.<br>
[13] Meyer, S. (2009). Breaking dawn. 1st special ed. London, Little, Brown Children.<br>
[14] Ian, F. (2002). Casino Royale. Penguin Books.<br>
[15] Heller, Joseph, 1798-1849. (1961). Catch-22, a novel. New York :The Modern library,<br>
[16] Dahl, R. (2016). Charlie and the chocolate factory. Puffin.<br>
[17] Dickens, C., & de Gavin, A. (1992). David Copperfield. Wordsworth Editions.<br>
[18] Stoker, B. (1993). Dracula. Wordsworth Editions.<br>
[19] Herbert, F. (2006). Dune. Hodder Paperback.<br>
[20] Meyer, S. (2008). Eclipse. ATOM.<br>
[21] Austen, J. (1992). Emma. Wordsworth Editions.<br>
[22] Follett, K. (1978). Eye of the needle: a novel. New York, Arbor House.<br>
[23] Bradbury, R. (1992). Fahrenheit 451. Del Rey Books.<br>
[24] Jong, E. (1973). Fear of flying. [Book club ed.]. New York, Holt, Rinehart and Winston.<br>
[25] James, E. L. (2012). Fifty shades of grey. Random House.<br>
[26] James, E. L. (2017). Fifty shades darker. Vintage books movie tie-in edition. New York, Vintage Books.<br>
[27] Blyton, Enid. (1942). The Famous Five: Five on a treasure island . : .<br>
[28] Shelley, M. (2012). Frankenstein. Penguin Classics.<br>
[29] Flynn, G. (2014). Gone Girl. Weidenfeld & Nicolson.<br>
[30] Dickens, C. (1992). Great Expectations. Wordsworth Editions.<br>
[31] Rowling, J. K. (2014). Harry potter and the chamber of secrets. Bloomsbury Childrens Books.<br>
[32] Rowling, J. K. (2014). Harry potter and the deathly hallows. Bloomsbury Childrens Books.<br>
[33] Rowling, J. K. (2014). Harry potter and the goblet of fire. Bloomsbury Childrens Books.<br>
[34] Rowling, J. K. (2014). Harry potter and the half-blood prince. Bloomsbury Childrens Books.<br>
[35] Rowling, J. K. (2014). Harry potter and the order of the Phoenix. Bloomsbury Childrens Books.<br>
[36] Rowling, J. K. (2014). Harry potter and the philosopher’s stone. Bloomsbury Childrens Books.<br>
[37] Rowling, J. K. (2014). Harry potter and the prisoner of azkaban. Bloomsbury Childrens Books.<br>
[38] Adams, D. (2007). The hitchhiker’s guide to the galaxy. Random House.<br>
[39] Doyle, S. A. C. (2012). The hound of the baskervilles. Penguin Classics.<br>
[40] Collins, S. (2008). The Hunger Games. Scholastic.<br>
[41] Collins S. (2011). Catching Fire. Scholastic.<br>
[42] Collins, S. (2011). Mockingjay. Scholastic.<br>
[43] Brown, D. (2013). Inferno: a novel. 1st large print ed. New York, Random House Large Print.<br>
[44] Bronte, C. (1992). Jane Eyre. Wordsworth Editions.<br>
[45] Alcott, Louisa May, 1832-1888. (1953). Little women. Melbourne ; London ; Baltimore :Penguin Books,<br>
[46] Golding, W. (2011). Lord of the flies. Faber & Faber.<br>
[47] Tolkien, J. R. R. (1991). The lord of the rings. HarperCollins.<br>
[48] Stone, I. (1984). Lust for life. 50th anniversary ed. New York, A Plume Book.<br>
[49] Dahl, R. (1988). Matilda. Penguin.<br>
[50] Moyes, J. (2012). Me Before You. Michael Joseph.<br>
[51] Melville, Herman, 1819-1891. (2009). Moby-Dick. [Ashland, Or.] :Blackstone Audio,<br>
[52] Eco, U. (2004). The name of the Rose. Vintage Classics.<br>
[53] Meyer, S. (2008). New moon. 1st pbk. ed. New York, Little, Brown.<br>
[54] Maugham, W. S. 1., & Crossley, S. (2010). Of human bondage. Unabridged. Old Saybrook, CT, Tantor Media.<br>
[55] Dickens, C. (1992). Oliver Twist. Wordsworth Editions.<br>
[56] Brown, D. (2017). Origin: a novel. First large print edition. [New York], Random House Large Print.<br>
[57] Süskind, Patrick. (1986). Perfume : the story of a murderer. New York :A.A. Knopf,<br>
[58] Austen, J. (2003). Pride and Prejudice (V. Jones, Ed.). Penguin Classics.<br>
[59] Clavell, J. (1986). Shogun. Bantam Doubleday Dell Publishing Group.<br>
[60] Eliot, G. (2012). Silas Marner. Penguin Classics.<br>
[61] Gaarder, Jostein, 1952-. (1994). Sophie's world : a novel about the history of philosophy. New York :Farrar, Straus and Giroux,<br>
[62] Burroughs, Edgar Rice, 1875-1950. (2010). Tarzan of the apes. Oxford [England] ; New York :Oxford University Press,<br>
[63] Hardy, T. (2012). Tess of the D’Urbervilles. Penguin Classics.<br>
[64] Doyle, A. C. (1950). The adventures of Sherlock Holmes. New York, The Heritage Press.<br>
[65] Twain, M. (2011). The adventures of tom Sawyer. William Collins.<br>
[66] Wharton, E. (1994). The age of innocence. Wordsworth Editions.<br>
[67] Coelho, P. (1995). The alchemist. Thorsons.<br>
[68] Boyne, John, 1971-. (2006). The boy in the striped pajamas : a fable. New York :David Fickling Books,<br>
[69] London, J. (2008). The call of the wild. Puffin Classics.<br>
[70] Salinger, J. D. (2001). Catcher in the Rye. Back Bay Books.<br>
[71] Lewis, C. S. 1., & Baynes, P. (1994). Prince Caspian. New York, N.Y., HarperTrophy.<br>
[72] Lewis, C. S.. (1978). The Chronicles of Narnia : The Lion, the Witch and the Wardrobe . New York: Harper Collins Publisher.<br>
[73] Lewis, C. S.. (1980). The Chronicles of Narnia : The Voyage of the Dawn Treader . United States of America: Harper Collins Publisher.<br>
[74] Brown, D. (2006). The Da Vinci Code. Corgi Books.<br>
[75] Green, J. (2013). The Fault in Our Stars. Penguin Books.<br>
[76] Hawkins, Paula. The Girl on the Train. New York :Riverhead Books, a member of Penguin Group (USA), 2015.<br>
[77] Larsson, S. (2009). The girl with the dragon tattoo. Quercus Publishing.<br>
[78] Lowry, L. (2014). The Giver. HarperCollins.<br>
[79] Puzo, M. (2009). The Godfather. Arrow Books.<br>
[80] Fitzgerald, F. S. (2019). The great Gatsby. Wordsworth Editions.<br>
[81] Tolkien, J. R. R. (2012). The Hobbit. HarperCollins.<br>
[82] Hawthorne, Nathaniel, 1804-1864. The House of the Seven Gables. Charlottesville, Va. :University of Virginia Library, 1996.<br>
[83] Wells, H. G. 1. (1988). The invisible man. Aerie Books.<br>
[84] Brown, D. (2009). The lost symbol: a novel. New York, Doubleday.<br>
[85] Britannica, T. Editors of Encyclopaedia (2022, January 10). The Mysteries of Udolpho. Encyclopedia Britannica.<br>
[86] Hinton, S. E. (2016). The Outsiders. Penguin Books.<br>
[87] Wilde, O. (1992). The picture of Dorian gray. Wordsworth Editions.<br>
[88] Follett, K. (1990). Pillars of the earth. New York, N.Y., Signet.<br>
[89] Camus, A. (2020). The Plague. Penguin Classics.<br>
[90] Hawthorne, N. (2003). Scarlet Letter. Penguin Classics.<br>
[91] Townsend, Sue. (2003). The secret diary of Adrian Mole aged 13 3/4 . New York: Harper Collins.<br>
[92] Burnett, F. H. (2017). The Secret Garden. Virago Press.<br>
[93] Young, W. P. (2008). The Shack. Hodder & Stoughton.<br>
[94] Ruiz Zafón, C., & Graves, L. (2005). The shadow of the wind. New York, Penguin Books.<br>
[95] Tolkien, J. R. R. (1991). The Silmarillion. HarperCollins.<br>
[96] James, H. (1991). The turn of the screw. Dover Publications.<br>
[97] Hurston, Z. N. (2018). Their eyes were watching god. Virago Press.<br>
[98] Achebe, C. (2006). Things Fall Apart. Penguin Classics.<br>
[99] Lee, H. (2010). To kill A mockingbird. Arrow Books.<br>
[100] Stevenson, R. L. (2016). Treasure Island. Puffin Classics.<br>
[101] Meyer, S. (2007). Twilight. ATOM.<br>
[102] Joyce, J. (2010). Ulysses. Wordsworth Editions.<br>
[103] Tolstoy, L. (1993). War and peace (L. Maude & A. Maude, Trans.). Wordsworth Editions.<br>
[104] Parkins, D., & Adams, R. (2014). Watership Down. Puffin Classics.<br>
[105] Bronte, E. (2012). Wuthering Heights. Penguin Classics.<br>
[106] Shannon, S. (2023). A day of fallen night. New York, Bloomsbury Publishing.<br>
[107] Chung, N. (2023). A living remedy: a memoir. First edition. New York, NY, Ecco.<br>
[108] Adebayo, A. (2023). A spell of good things: a novel. First edition. New York, Alfred A. Knopf.<br>
[109] Koontz, D. R. 1. (2023). After death. First edition. Seattle, Thomas & Mercer.<br>
[110] Patterson, J., & DuBois, B. (2022). Blowback. First edition. New York, Little, Brown and Company.<br>
[111] Solomon, R. L. (2023). Business or pleasure. First edition. New York, Berkley Romance.<br>
[112] Casati, C. (2023). Clytemnestra: a novel. Naperville, Illinois, Sourcebooks Landmark.<br>
[113] Smith, T. R. (2023). Cold people: a novel. First Scribner hardcover edition. New York, Scribner.<br>
[114] Frumkin, R. (2023). Confidence: a novel. First Simon & Schuster hardcover edition. New York, Simon & Schuster.<br>
[115] Ross, R. (2023). Divine rivals: a novel. First edition. New York, Wednesday Books.<br>
[116] Fawcett, H. (2023). Emily Wilde's encyclopaedia of faeries. First edition. New York, Del Rey.<br>
[117] Rivero, M. (2023). Flores and Miss Paula: a novel. First edition. New York, Ecco, an imprint of HarperCollins Publishers.<br>
[118] Henry, E. (2023). Happy place. New York, Berkley.<br>
[119] Delury, J. (2023). Hedge: a novel. New York, Zibby Books.<br>
[120] Bardugo, L. (2023). Hell bent. First edition. New York, NY, Flatiron Books.<br>
[121] Center, K. (2023). Hello stranger. First edition. New York, St. Martin's Press.<br>
[122] Hibbert, T. (2023). Highly suspicious and unfairly cute. First edition. New York, Joy Revolution, of Random House Children's Books.<br>
[123] Burton, J. (2023). Housebroke. First edition. New York, Berkley Romance.<br>
[124] Hendrix, G. (2023). How to sell a haunted house. New York, Berkley.<br>
[125] Makkai, R. (2023). I have some questions for you. [New York], Viking.<br>
[126] Klune, T. (2023). In the lives of puppets. First edition. New York, Tor Publishing Group.<br>
[127] Boyle, L. (2023). In the Silence of Decay. [United States], Lisa Boyle.<br>
[128] See, L. (2023). Lady Tan's circle of women: a novel. First Scribner hardcover edition. New York, Scribner.<br>
[129] Hazelwood, A. (2023). Love, theoretically. New York, Berkley.<br>
[130] George, J. (2023). Maame. First edition. New York, St. Martin's Press.<br>
[131] Brown, A. (2023). Perfect little lives. Toronto, ON, Graydon House.<br>
[132] Irby, S. (2023). Quietly hostile: essays. New York, Vintage Books, a division of Penguin Random House LLC.<br>
[133] Harding, R. (2023). The drowning woman. First edition. New York, Grand Central Publishing.<br>
[134] Sittenfeld, C. (2023). Romantic comedy: a novel. First edition. New York, Random House.<br>
[135] Davis-Goff, S. (2023). Silent city. First U.S. Edition. New York, Flatiron Books.<br>
[136] Moreno-Garcia, S. (2023). Silver nitrate. First edition. New York, Del Rey.<br>
[137] Williams, D. (2023). Technically yours. First edition. New York, Berkley Romance.<br>
[138] Page, S. (2023). The Book of Beginnings. HarperCollins.<br>
[139] Ward, M. (2023). The Darkness Before Them. Orbit.<br>
[140] Kantra, V. (2023). The fairytale life of Dorothy Gale. First Edition. New York, Berkley.<br>
[141] Ferguson, L. (2023). The fake mate. First edition. New York, Berkley Romance.<br>
[142] Whitten, H. (2023). The foxglove king. First Edition. New York, NY, Orbit.<br>
[143] Hay, A. (2023). The housekeepers: a novel. Toronto, Ontario, Canada, Graydon House.<br>
[144] Johansen, E. (2023). The Kingdom of Sweets: a novel of The Nutcracker. [New York], Dutton.<br>
[145] Jackson, L. (2023). The last sinner. First Kensington hardcover edition. New York, NY, Kensington Publishing Corp.<br>
[146] Woods, E. (2023). The lost bookshop. Paperback edition. London, One More Chapter.<br>
[147] Sager, R. (2023). The only one left: a novel. New York, Dutton.<br>
[148] Ryan, R. (2023). The Paris housekeeper. Toronto, Ontario, Canada, Love Inspired.<br>
[149] Hashem, S. (2023). The Jasad heir. First edition. New York, NY, Orbit.<br>
[150] Leigh, J. (2023). The Silver Ladies Do Lunch, [United States], Boldwood Books.<br>
[151] Davis, F. (2023). The spectacular: a novel. New York, Dutton.<br>
[152] Harvey, K. W. (2023). The summer of songbirds: a novel. First Gallery Books hardcover edition. New York, Gallery Books.<br>
[153] Lauren, C. (2023). The True Love Experiment. First Gallery Books hardcover edition. New York, Gallery Books.<br>
[154] Lemmie, A. (2023). The wildest sun: a novel. [New York], Dutton.<br>
[155] Hatzopoulou, K. (2023). Threads that bind. New York, Razorbill.<br>
[156] Harkaway, N. (2023). Titanium noir: a novel. First edition. New York, Alfred A. Knopf.<br>
[157] Adams, K. J. (2023). Tonight, I burn. First edition. New York, NY, Orbit.<br>
[158] Parks, A. (2023). Two dead wives. Toronto, Mira.<br>
[159] Bailey, T. (2023). Unfortunately yours: a novel. First edition. New York, NY, Avon, an imprint of HarperCollinsPublishers.<br>
[160] McGinnis, M. W. (2023). USS Lincoln: Mercy Kill. Avenstar Productions.<br>
[161] Sebastian, C. (2023). We could be so good: a novel. First edition. New York, NY, Avon, an imprint of HarperCollinsPublishers.<br>
[162] LeCheminant, D. (2023). What Dreams May Come. Covenant Communications.<br>
[163] Marshall, K. A. (2023). What lies in the woods. First edition. New York, Flatiron Books.<br>
[164] Kuang, R. F. (2023). Yellowface: a novel. First edition. New York, NY, William Morrow, an imprint of HarperCollinsPublishers.<br>
[165] Bayron, K. (2023). You're not supposed to die tonight. New York, Bloomsbury YA.<br>
</details> |
AdapterOcean/chemistry_dataset_standardized_cluster_0_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 14956432
num_examples: 11094
download_size: 6327359
dataset_size: 14956432
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "chemistry_dataset_standardized_cluster_0_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SDbiaseval/professions | ---
dataset_info:
features:
- name: adjective
dtype: string
- name: profession
dtype: string
- name: 'no'
dtype: int32
- name: image_path
dtype: string
- name: image
dtype: image
- name: model
dtype: string
splits:
- name: train
num_bytes: 3088839692.5
num_examples: 94500
download_size: 3075495491
dataset_size: 3088839692.5
---
# Dataset Card for "professions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/data-standardized_cluster_22_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 27474399
num_examples: 25474
download_size: 11969072
dataset_size: 27474399
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_22_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/Sample_vqa_test_for_colab | ---
dataset_info:
features:
- name: question_type
dtype: string
- name: multiple_choice_answer
dtype: string
- name: answers
sequence: string
- name: answers_original
list:
- name: answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: id_image
dtype: int64
- name: answer_type
dtype: string
- name: question_id
dtype: int64
- name: question
dtype: string
- name: image
dtype: image
- name: id
dtype: int64
splits:
- name: test
num_bytes: 1599032.0
num_examples: 10
download_size: 1594854
dataset_size: 1599032.0
---
# Dataset Card for "Sample_vqa_test_for_colab"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
redwoodresearch/diamonds-seed2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: is_correct
dtype: bool
- name: is_clean
dtype: bool
- name: measurements
sequence: bool
- name: difficulty
dtype: int64
splits:
- name: train
num_bytes: 61879472
num_examples: 25000
- name: validation
num_bytes: 20208009
num_examples: 7989
- name: train_for_val
num_bytes: 7489770
num_examples: 2997
download_size: 1102497
dataset_size: 89577251
---
# Dataset Card for "diamonds-seed2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |