| datasetId | card |
|---|---|
Limour/perplexity | ---
license: cc-by-nc-sa-4.0
language:
- zh
tags:
- not-for-all-audiences
---
https://www.kaggle.com/code/reginliu/perplexity
| Model | Size (GB) | PPL | n_vocab | PPL_adjust |
|-------|---------|---------|---------|---------|
| [qwen1_5-14b-chat-IQ3_XS.gguf](https://huggingface.co/Limour/Qwen1.5-14B-Chat-GGUF/blob/main/qwen1_5-14b-chat-IQ3_XS.gguf) | 6.48 | 11.8084 +/- 0.121615 | 152064 | 11.8084 |
| [causallm_14b.IQ3_XS.gguf](https://huggingface.co/Limour/CausalLM-14B-GGUF/blob/main/causallm_14b.IQ3_XS.gguf) | 6.48 | 13.3798 +/- 0.13641 | 152064 | 13.3798 |
| [causallm_14b.IQ4_XS.gguf](https://huggingface.co/Limour/CausalLM-14B-GGUF/blob/main/causallm_14b.IQ4_XS.gguf) | 7.85 | 13.4127 +/- 0.13762 | 152064 | 13.4127 |
| [causallm_14b.Q4_0.gguf](https://huggingface.co/TheBloke/CausalLM-14B-GGUF/blob/main/causallm_14b.Q4_0.gguf) | 8.18 | 13.6714 +/- 0.13964 | 152064 | 13.6714 |
| [causallm_14b.IQ2_XXS.gguf](https://huggingface.co/Limour/CausalLM-14B-GGUF/blob/main/causallm_14b.IQ2_XXS.gguf) | 4.98 | 15.0160 +/- 0.15004 | 152064 | 15.0160 |
| [Yi-9B-200K_iQ3xxs.gguf](https://huggingface.co/MarsupialAI/Yi-9B-200K_iMatrix_GGUF/blob/main/Yi-9B-200K_iQ3xxs.gguf) | 3.47 | 6.8157 +/- 0.05453 | 64000 | 16.1941 |
| [Fi-9B-200K-Q8_0.gguf](https://huggingface.co/DisOOM/Fi-9B-GGUF/blob/main/Fi-9B-Q8_0.gguf) | 9.38 | 6.8402 +/- 0.05741 | 64000 | 16.2523 |
| [causallm_7b.Q5_K_M.gguf](https://huggingface.co/TheBloke/CausalLM-7B-GGUF/blob/main/causallm_7b.Q5_K_M.gguf) | 5.53 | 16.5278 +/- 0.18005 | 152064 | 16.5278 |
| [Qwen1.5-22B-Chat-Merge-Q4_0.gguf](https://huggingface.co/DisOOM/Qwen1.5-22B-Chat-Merge-GGUF/blob/main/Qwen1.5-22B-Chat-Merge-Q4_0.gguf) | 12.6 | 21.9669 +/- 0.28980 | 152064 | 21.9669 |
| [Kunoichi-DPO-v2-7B-Q4_K_M-imatrix.gguf](https://hf-mirror.com/Lewdiculous/Kunoichi-DPO-v2-7B-GGUF-Imatrix/blob/main/Kunoichi-DPO-v2-7B-Q4_K_M-imatrix.gguf) | 4.37 | 6.7096 +/- 0.04519 | 32000 | 31.8840 |
For a model that emits tokens uniformly at random, every token has probability
$$ P(token|context) = \frac{1}{n_{vocab}}, \quad PPL = \sqrt[N]{\prod_{i=1}^{N} \frac{1}{P(token_i|context)}} = n_{vocab} $$
so, to compare models with different vocabulary sizes, perplexity is normalized to the 152064-token Qwen vocabulary:
$$ PPL_{adjust} = \frac{PPL}{n_{vocab}} \times 152064 $$ |
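The adjustment above can be sketched as a small helper; `ppl_adjust` and `REFERENCE_VOCAB` are illustrative names, not part of the linked notebook, and the sample values are rows from the table above.

```python
# Vocabulary-adjusted perplexity, as defined on this card: a random model
# over n_vocab tokens has PPL == n_vocab, so scaling by 152064 / n_vocab
# (the Qwen1.5 vocabulary size used as the baseline) makes PPL values
# comparable across tokenizers with different vocabulary sizes.
REFERENCE_VOCAB = 152064  # n_vocab of the Qwen1.5 models in the table

def ppl_adjust(ppl: float, n_vocab: int, reference_vocab: int = REFERENCE_VOCAB) -> float:
    """Scale a model's raw perplexity to the reference vocabulary size."""
    return ppl / n_vocab * reference_vocab

# Yi-9B-200K: raw PPL 6.8157 with a 64000-token vocabulary
print(round(ppl_adjust(6.8157, 64000), 4))   # 16.1941, matching the table

# Models already using the 152064-token vocabulary are unchanged
print(ppl_adjust(11.8084, 152064))           # 11.8084
```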
japanese-asr/whisper_transcriptions.reazonspeech.all_49 | ---
dataset_info:
config_name: all
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 30365182032.0
num_examples: 267347
download_size: 30123643810
dataset_size: 30365182032.0
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
---
|
irds/beir_dbpedia-entity_test | ---
pretty_name: '`beir/dbpedia-entity/test`'
viewer: false
source_datasets: ['irds/beir_dbpedia-entity']
task_categories:
- text-retrieval
---
# Dataset Card for `beir/dbpedia-entity/test`
The `beir/dbpedia-entity/test` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/beir#beir/dbpedia-entity/test).
# Data
This dataset provides:
- `queries` (i.e., topics); count=400
- `qrels` (relevance assessments); count=43,515
- For `docs`, use [`irds/beir_dbpedia-entity`](https://huggingface.co/datasets/irds/beir_dbpedia-entity)
## Usage
```python
from datasets import load_dataset

queries = load_dataset('irds/beir_dbpedia-entity_test', 'queries')
for record in queries:
    record  # {'query_id': ..., 'text': ...}

qrels = load_dataset('irds/beir_dbpedia-entity_test', 'qrels')
for record in qrels:
    record  # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Hasibi2017DBpediaEntityVA,
title={DBpedia-Entity v2: A Test Collection for Entity Search},
author={Faegheh Hasibi and Fedor Nikolaev and Chenyan Xiong and K. Balog and S. E. Bratsberg and Alexander Kotov and J. Callan},
journal={Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval},
year={2017}
}
@article{Thakur2021Beir,
title = "BEIR: A Heterogenous Benchmark for Zero-shot Evaluation of Information Retrieval Models",
author = "Thakur, Nandan and Reimers, Nils and Rücklé, Andreas and Srivastava, Abhishek and Gurevych, Iryna",
journal= "arXiv preprint arXiv:2104.08663",
month = "4",
year = "2021",
url = "https://arxiv.org/abs/2104.08663",
}
```
|
ozz/turkishReviews-ds-mini | ---
dataset_info:
features:
- name: review
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 134598991.2416305
num_examples: 362520
- name: validation
num_bytes: 14955814.758369517
num_examples: 40281
download_size: 95987466
dataset_size: 149554806.0
---
# Dataset Card for "turkishReviews-ds-mini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AnabolAndi/Models_test | ---
license: other
---
|
chansung/auto-paper-qa-test | ---
dataset_info:
features:
- name: title
dtype: string
- name: summary
dtype: string
- name: abstract
dtype: string
- name: authors
dtype: string
- name: arxiv_id
dtype: string
- name: 0_question
dtype: string
- name: 0_answers:eli5
dtype: string
- name: 0_answers:expert
dtype: string
- name: 0_additional_depth_q:follow up question
dtype: string
- name: 0_additional_depth_q:answers:eli5
dtype: string
- name: 0_additional_depth_q:answers:expert
dtype: string
- name: 0_additional_breath_q:follow up question
dtype: string
- name: 0_additional_breath_q:answers:eli5
dtype: string
- name: 0_additional_breath_q:answers:expert
dtype: string
- name: 1_question
dtype: string
- name: 1_answers:eli5
dtype: string
- name: 1_answers:expert
dtype: string
- name: 1_additional_depth_q:follow up question
dtype: string
- name: 1_additional_depth_q:answers:eli5
dtype: string
- name: 1_additional_depth_q:answers:expert
dtype: string
- name: 1_additional_breath_q:follow up question
dtype: string
- name: 1_additional_breath_q:answers:eli5
dtype: string
- name: 1_additional_breath_q:answers:expert
dtype: string
- name: 2_question
dtype: string
- name: 2_answers:eli5
dtype: string
- name: 2_answers:expert
dtype: string
- name: 2_additional_depth_q:follow up question
dtype: string
- name: 2_additional_depth_q:answers:eli5
dtype: string
- name: 2_additional_depth_q:answers:expert
dtype: string
- name: 2_additional_breath_q:follow up question
dtype: string
- name: 2_additional_breath_q:answers:eli5
dtype: string
- name: 2_additional_breath_q:answers:expert
dtype: string
- name: 3_question
dtype: string
- name: 3_answers:eli5
dtype: string
- name: 3_answers:expert
dtype: string
- name: 3_additional_depth_q:follow up question
dtype: string
- name: 3_additional_depth_q:answers:eli5
dtype: string
- name: 3_additional_depth_q:answers:expert
dtype: string
- name: 3_additional_breath_q:follow up question
dtype: string
- name: 3_additional_breath_q:answers:eli5
dtype: string
- name: 3_additional_breath_q:answers:expert
dtype: string
- name: target_date
dtype: timestamp[s]
- name: 4_question
dtype: string
- name: 4_answers:eli5
dtype: string
- name: 4_answers:expert
dtype: string
- name: 4_additional_depth_q:follow up question
dtype: string
- name: 4_additional_depth_q:answers:eli5
dtype: string
- name: 4_additional_depth_q:answers:expert
dtype: string
- name: 4_additional_breath_q:follow up question
dtype: string
- name: 4_additional_breath_q:answers:eli5
dtype: string
- name: 4_additional_breath_q:answers:expert
dtype: string
- name: 5_question
dtype: string
- name: 5_answers:eli5
dtype: string
- name: 5_answers:expert
dtype: string
- name: 5_additional_depth_q:follow up question
dtype: string
- name: 5_additional_depth_q:answers:eli5
dtype: string
- name: 5_additional_depth_q:answers:expert
dtype: string
- name: 5_additional_breath_q:follow up question
dtype: string
- name: 5_additional_breath_q:answers:eli5
dtype: string
- name: 5_additional_breath_q:answers:expert
dtype: string
- name: 6_question
dtype: string
- name: 6_answers:eli5
dtype: string
- name: 6_answers:expert
dtype: string
- name: 6_additional_depth_q:follow up question
dtype: string
- name: 6_additional_depth_q:answers:eli5
dtype: string
- name: 6_additional_depth_q:answers:expert
dtype: string
- name: 6_additional_breath_q:follow up question
dtype: string
- name: 6_additional_breath_q:answers:eli5
dtype: string
- name: 6_additional_breath_q:answers:expert
dtype: string
- name: 7_question
dtype: string
- name: 7_answers:eli5
dtype: string
- name: 7_answers:expert
dtype: string
- name: 7_additional_depth_q:follow up question
dtype: string
- name: 7_additional_depth_q:answers:eli5
dtype: string
- name: 7_additional_depth_q:answers:expert
dtype: string
- name: 7_additional_breath_q:follow up question
dtype: string
- name: 7_additional_breath_q:answers:eli5
dtype: string
- name: 7_additional_breath_q:answers:expert
dtype: string
- name: 8_question
dtype: string
- name: 8_answers:eli5
dtype: string
- name: 8_answers:expert
dtype: string
- name: 8_additional_depth_q:follow up question
dtype: string
- name: 8_additional_depth_q:answers:eli5
dtype: string
- name: 8_additional_depth_q:answers:expert
dtype: string
- name: 8_additional_breath_q:follow up question
dtype: string
- name: 8_additional_breath_q:answers:eli5
dtype: string
- name: 8_additional_breath_q:answers:expert
dtype: string
- name: 9_question
dtype: string
- name: 9_answers:eli5
dtype: string
- name: 9_answers:expert
dtype: string
- name: 9_additional_depth_q:follow up question
dtype: string
- name: 9_additional_depth_q:answers:eli5
dtype: string
- name: 9_additional_depth_q:answers:expert
dtype: string
- name: 9_additional_breath_q:follow up question
dtype: string
- name: 9_additional_breath_q:answers:eli5
dtype: string
- name: 9_additional_breath_q:answers:expert
dtype: string
splits:
- name: train
num_bytes: 51368
num_examples: 3
download_size: 258271
dataset_size: 51368
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DeepFoldProtein/openfold_msa_contrastive_cards_000_processed_1024_ankh | ---
dataset_info:
features:
- name: query_accession
sequence: string
- name: excludes
sequence:
sequence: string
- name: query_sequence
sequence: string
- name: target_accessions
sequence:
sequence: string
- name: target_sequences
sequence:
sequence: string
- name: input_ids
sequence:
sequence:
sequence: int64
- name: attention_mask
sequence:
sequence:
sequence: int64
- name: special_tokens_mask
sequence:
sequence:
sequence: int64
splits:
- name: train
num_bytes: 71626433715
num_examples: 64937
download_size: 2140946171
dataset_size: 71626433715
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
danjacobellis/food101_cascade | ---
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': apple_pie
'1': baby_back_ribs
'2': baklava
'3': beef_carpaccio
'4': beef_tartare
'5': beet_salad
'6': beignets
'7': bibimbap
'8': bread_pudding
'9': breakfast_burrito
'10': bruschetta
'11': caesar_salad
'12': cannoli
'13': caprese_salad
'14': carrot_cake
'15': ceviche
'16': cheesecake
'17': cheese_plate
'18': chicken_curry
'19': chicken_quesadilla
'20': chicken_wings
'21': chocolate_cake
'22': chocolate_mousse
'23': churros
'24': clam_chowder
'25': club_sandwich
'26': crab_cakes
'27': creme_brulee
'28': croque_madame
'29': cup_cakes
'30': deviled_eggs
'31': donuts
'32': dumplings
'33': edamame
'34': eggs_benedict
'35': escargots
'36': falafel
'37': filet_mignon
'38': fish_and_chips
'39': foie_gras
'40': french_fries
'41': french_onion_soup
'42': french_toast
'43': fried_calamari
'44': fried_rice
'45': frozen_yogurt
'46': garlic_bread
'47': gnocchi
'48': greek_salad
'49': grilled_cheese_sandwich
'50': grilled_salmon
'51': guacamole
'52': gyoza
'53': hamburger
'54': hot_and_sour_soup
'55': hot_dog
'56': huevos_rancheros
'57': hummus
'58': ice_cream
'59': lasagna
'60': lobster_bisque
'61': lobster_roll_sandwich
'62': macaroni_and_cheese
'63': macarons
'64': miso_soup
'65': mussels
'66': nachos
'67': omelette
'68': onion_rings
'69': oysters
'70': pad_thai
'71': paella
'72': pancakes
'73': panna_cotta
'74': peking_duck
'75': pho
'76': pizza
'77': pork_chop
'78': poutine
'79': prime_rib
'80': pulled_pork_sandwich
'81': ramen
'82': ravioli
'83': red_velvet_cake
'84': risotto
'85': samosa
'86': sashimi
'87': scallops
'88': seaweed_salad
'89': shrimp_and_grits
'90': spaghetti_bolognese
'91': spaghetti_carbonara
'92': spring_rolls
'93': steak
'94': strawberry_shortcake
'95': sushi
'96': tacos
'97': takoyaki
'98': tiramisu
'99': tuna_tartare
'100': waffles
- name: compressed_image
dtype: binary
splits:
- name: train
num_bytes: 176984876
num_examples: 75747
- name: validation
num_bytes: 59015433
num_examples: 25250
download_size: 249884687
dataset_size: 236000309
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
minimario/apps_partial_sorted_300_350 | ---
dataset_info:
features:
- name: problem
dtype: string
- name: code
dtype: string
- name: label
dtype: int64
- name: full_sample
dtype: string
- name: where_from
dtype: string
splits:
- name: train
num_bytes: 25695567
num_examples: 20543
download_size: 923285
dataset_size: 25695567
---
# Dataset Card for "apps_partial_sorted_300_350"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
james-burton/OrientalMuseum_min4-white-mat | ---
dataset_info:
features:
- name: obj_num
dtype: string
- name: file
dtype: string
- name: image
dtype: image
- name: root
dtype: string
- name: description
dtype: string
- name: object_name
dtype: string
- name: other_name
dtype: string
- name: label
dtype:
class_label:
names:
'0': Actinolite
'1': Aluminium bronze alloy
'2': Animal Mummy
'3': Batik
'4': Buffalo Horn
'5': Chinese Red Rosewood
'6': Colour on Paper
'7': Flint/Chert
'8': Gouache on Paper
'9': Haematite/Red Ochre
'10': Human Bone
'11': Ink and Colour on Paper
'12': Ink and Colours on Silk
'13': Ink and Opaque Watercolour on Paper
'14': Ink on Paper
'15': Jade (Calcified)
'16': Japanese paper
'17': Microcline/Green Feldspar/Amazon-Stone
'18': Nile Mud
'19': Opaque Watercolour and Gilt on Paper
'20': Opaque Watercolour on Paper
'21': Opaque Watercolour or Gouache on Mica
'22': Pith
'23': Pith Paper
'24': Plant Product
'25': Resin/Plastic
'26': Rhinoceros Horn
'27': Shell (Ostrich Egg)
'28': Smaragdite
'29': Steatite
'30': Steatite/Soap Stone
'31': Watercolour on Rice Paper
'32': acrylic
'33': agate
'34': alabaster
'35': aluminum
'36': amber
'37': amethyst
'38': antler
'39': artificial stone
'40': balsa
'41': bamboo
'42': basalt
'43': bone
'44': bowenite
'45': boxwood
'46': brass
'47': brocade
'48': bronze
'49': burnt jade
'50': canvas
'51': cardboard
'52': cards
'53': carnelian
'54': cast iron
'55': celadon
'56': cellulose acetate
'57': ceramic
'58': chalcedony
'59': cherry
'60': clay
'61': cloth
'62': coconut
'63': copper
'64': copper alloy
'65': coral
'66': cotton
'67': crystal
'68': diorite
'69': dolerite
'70': earthenware
'71': ebony
'72': emerald
'73': enamel
'74': faience
'75': felt
'76': flax
'77': flint
'78': gauze
'79': glass
'80': gold
'81': granite
'82': gray ware
'83': hardwood
'84': horn
'85': incense
'86': ink
'87': iron
'88': ivory
'89': jade
'90': jadeite
'91': jasper
'92': lacquer
'93': lapis lazuli
'94': lazurite
'95': lead
'96': lead alloy
'97': leather
'98': limestone
'99': linen
'100': malachite
'101': marble
'102': metal
'103': mineral
'104': mother of pearl
'105': muslin
'106': nephrite
'107': nylon
'108': obsidian
'109': organic material
'110': organza
'111': paint
'112': palm fiber
'113': palm leaf
'114': paper
'115': papier mâché
'116': papyrus
'117': pewter
'118': photographic paper
'119': pine
'120': plant fiber
'121': plaster
'122': plastic
'123': plate
'124': polyester
'125': polystyrene
'126': porcelain
'127': pottery
'128': quartzite
'129': rattan
'130': realgar
'131': reed
'132': rice paper
'133': rock
'134': rush
'135': sandstone
'136': satin
'137': schist
'138': seashell
'139': serpentine
'140': shagreen
'141': shell
'142': silk
'143': siltstone
'144': silver
'145': silver alloy
'146': skull
'147': slate
'148': soapstone
'149': softwood
'150': stalagmites
'151': steel
'152': stone
'153': stoneware
'154': straw
'155': stucco
'156': sycamore
'157': synthetic fiber
'158': teak
'159': terracotta
'160': textiles
'161': tin
'162': tortoise shell
'163': tourmaline
'164': travertine
'165': tremolite
'166': turquoise
'167': velvet
'168': wood
'169': wool
'170': wrought iron
'171': zinc alloy
- name: production.period
dtype: string
- name: production.place
dtype: string
- name: new_root
dtype: string
splits:
- name: train
num_bytes: 646369556.254
num_examples: 23083
- name: validation
num_bytes: 182535306.672
num_examples: 5432
- name: test
num_bytes: 166408429.856
num_examples: 5432
download_size: 951074067
dataset_size: 995313292.7819998
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_19 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1433739164.0
num_examples: 281567
download_size: 1460267942
dataset_size: 1433739164.0
---
# Dataset Card for "chunk_19"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LambdaTests/VQAv2_sample_validation_benchmarks_partition_global_4_loca_4 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 15
num_examples: 1
download_size: 0
dataset_size: 15
---
# Dataset Card for "VQAv2_sample_validation_benchmarks_partition_global_4_loca_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gg-ai/es-0103-stop-no-demoji-no-hasthag-l | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
dataset_info:
features:
- name: text
dtype: string
- name: clean_text
dtype: string
- name: sent
dtype: int64
splits:
- name: train
num_bytes: 10604987
num_examples: 28854
- name: test
num_bytes: 2174343
num_examples: 6131
- name: val
num_bytes: 370030
num_examples: 1082
download_size: 8223549
dataset_size: 13149360
---
# Dataset Card for "es-0103-stop-no-demoji-no-hasthag-l"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TheBloke__Kimiko-v2-13B-fp16 | ---
pretty_name: Evaluation run of TheBloke/Kimiko-v2-13B-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Kimiko-v2-13B-fp16](https://huggingface.co/TheBloke/Kimiko-v2-13B-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Kimiko-v2-13B-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T17:23:39.395223](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Kimiko-v2-13B-fp16/blob/main/results_2023-10-22T17-23-39.395223.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0017827181208053692,\n\
\ \"em_stderr\": 0.00043200973460388544,\n \"f1\": 0.06393351510067083,\n\
\ \"f1_stderr\": 0.001389281752742565,\n \"acc\": 0.44652528493459387,\n\
\ \"acc_stderr\": 0.01048837556583878\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.00043200973460388544,\n\
\ \"f1\": 0.06393351510067083,\n \"f1_stderr\": 0.001389281752742565\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12509476876421532,\n \
\ \"acc_stderr\": 0.009112601439849618\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.01186414969182794\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Kimiko-v2-13B-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|arc:challenge|25_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T17_23_39.395223
path:
- '**/details_harness|drop|3_2023-10-22T17-23-39.395223.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T17-23-39.395223.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T17_23_39.395223
path:
- '**/details_harness|gsm8k|5_2023-10-22T17-23-39.395223.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T17-23-39.395223.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hellaswag|10_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T17_23_39.395223
path:
- '**/details_harness|winogrande|5_2023-10-22T17-23-39.395223.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T17-23-39.395223.parquet'
- config_name: results
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- results_2023-08-31T10:23:07.841871.parquet
- split: 2023_10_22T17_23_39.395223
path:
- results_2023-10-22T17-23-39.395223.parquet
- split: latest
path:
- results_2023-10-22T17-23-39.395223.parquet
---
# Dataset Card for Evaluation run of TheBloke/Kimiko-v2-13B-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Kimiko-v2-13B-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Kimiko-v2-13B-fp16](https://huggingface.co/TheBloke/Kimiko-v2-13B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Kimiko-v2-13B-fp16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T17:23:39.395223](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Kimiko-v2-13B-fp16/blob/main/results_2023-10-22T17-23-39.395223.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0017827181208053692,
"em_stderr": 0.00043200973460388544,
"f1": 0.06393351510067083,
"f1_stderr": 0.001389281752742565,
"acc": 0.44652528493459387,
"acc_stderr": 0.01048837556583878
},
"harness|drop|3": {
"em": 0.0017827181208053692,
"em_stderr": 0.00043200973460388544,
"f1": 0.06393351510067083,
"f1_stderr": 0.001389281752742565
},
"harness|gsm8k|5": {
"acc": 0.12509476876421532,
"acc_stderr": 0.009112601439849618
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.01186414969182794
}
}
```
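The top-level `"all"` block aggregates the per-task metrics. For these results, the aggregated `acc` appears to be the unweighted mean of the two accuracy-based tasks (gsm8k and winogrande), with the drop em/f1 values carried through unchanged — an assumption inferred from the numbers above, not a documented guarantee of the harness. A quick sanity check, using the values copied from the JSON (no download needed):

```python
import math

# Per-task accuracies copied from the "Latest results" JSON above.
results = {
    "harness|gsm8k|5": {"acc": 0.12509476876421532},
    "harness|winogrande|5": {"acc": 0.7679558011049724},
}

# Assumption: the "all" accuracy is the unweighted mean of the
# per-task accuracies.
mean_acc = sum(task["acc"] for task in results.values()) / len(results)
assert math.isclose(mean_acc, 0.44652528493459387, rel_tol=1e-12)
print(f"aggregated acc: {mean_acc:.6f}")
```

The same check can be repeated against any other run's `results_*.json` file to see whether the aggregation rule holds there as well.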
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of lizhuang144/llama_mirror_13b_v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lizhuang144/llama_mirror_13b_v1.0](https://huggingface.co/lizhuang144/llama_mirror_13b_v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lizhuang144__llama_mirror_13b_v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T10:53:19.904763](https://huggingface.co/datasets/open-llm-leaderboard/details_lizhuang144__llama_mirror_13b_v1.0/blob/main/results_2023-09-17T10-53-19.904763.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002726510067114094,\n\
\ \"em_stderr\": 0.0005340111700415926,\n \"f1\": 0.06866086409395994,\n\
\ \"f1_stderr\": 0.0014864813602608763,\n \"acc\": 0.42033799014225337,\n\
\ \"acc_stderr\": 0.00955801932501455\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002726510067114094,\n \"em_stderr\": 0.0005340111700415926,\n\
\ \"f1\": 0.06866086409395994,\n \"f1_stderr\": 0.0014864813602608763\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07429871114480667,\n \
\ \"acc_stderr\": 0.007223844172845574\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183524\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lizhuang144/llama_mirror_13b_v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|arc:challenge|25_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T10_53_19.904763
path:
- '**/details_harness|drop|3_2023-09-17T10-53-19.904763.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T10-53-19.904763.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T10_53_19.904763
path:
- '**/details_harness|gsm8k|5_2023-09-17T10-53-19.904763.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T10-53-19.904763.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hellaswag|10_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T16:49:32.921683.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T16:49:32.921683.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T16:49:32.921683.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T10_53_19.904763
path:
- '**/details_harness|winogrande|5_2023-09-17T10-53-19.904763.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T10-53-19.904763.parquet'
- config_name: results
data_files:
- split: 2023_08_09T16_49_32.921683
path:
- results_2023-08-09T16:49:32.921683.parquet
- split: 2023_09_17T10_53_19.904763
path:
- results_2023-09-17T10-53-19.904763.parquet
- split: latest
path:
- results_2023-09-17T10-53-19.904763.parquet
---
# Dataset Card for Evaluation run of lizhuang144/llama_mirror_13b_v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lizhuang144/llama_mirror_13b_v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lizhuang144/llama_mirror_13b_v1.0](https://huggingface.co/lizhuang144/llama_mirror_13b_v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lizhuang144__llama_mirror_13b_v1.0",
"harness_winogrande_5",
         split="latest")
```
## Latest results
These are the [latest results from run 2023-09-17T10:53:19.904763](https://huggingface.co/datasets/open-llm-leaderboard/details_lizhuang144__llama_mirror_13b_v1.0/blob/main/results_2023-09-17T10-53-19.904763.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its timestamped split and in the "latest" split of the corresponding config):
```python
{
"all": {
"em": 0.002726510067114094,
"em_stderr": 0.0005340111700415926,
"f1": 0.06866086409395994,
"f1_stderr": 0.0014864813602608763,
"acc": 0.42033799014225337,
"acc_stderr": 0.00955801932501455
},
"harness|drop|3": {
"em": 0.002726510067114094,
"em_stderr": 0.0005340111700415926,
"f1": 0.06866086409395994,
"f1_stderr": 0.0014864813602608763
},
"harness|gsm8k|5": {
"acc": 0.07429871114480667,
"acc_stderr": 0.007223844172845574
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183524
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Ckail/Needy_Girl_Overdose_P | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_mnli_verbal_ing_suffix | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 1961229
num_examples: 9527
- name: dev_mismatched
num_bytes: 2066574
num_examples: 9535
- name: test_matched
num_bytes: 1948471
num_examples: 9403
- name: test_mismatched
num_bytes: 2069435
num_examples: 9573
- name: train
num_bytes: 78004343
num_examples: 373005
download_size: 55482068
dataset_size: 86050052
---
# Dataset Card for "MULTI_VALUE_mnli_verbal_ing_suffix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Julianasm/minhavozValmir | ---
license: openrail
---
|
nattiey1/diverse-unit-QA | ---
task_categories:
- question-answering
size_categories:
- 100K<n<1M
---
# Dataset Card for DUQA
## Table of Contents
- [Dataset Description](#dataset-description)
* [Abstract](#abstract)
* [Languages](#languages)
- [Dataset Structure](#dataset-structure)
* [Data Instances](#data-instances)
* [Data Fields](#data-fields)
- [Data Statistics](#data-statistics)
- [Dataset Creation](#dataset-creation)
* [Curation Rationale](#curation-rationale)
* [Source Data](#source-data)
* [Annotations](#annotations)
- [Considerations for Using the Data](#considerations-for-using-the-data)
* [Discussion of Social Impact and Biases](#discussion-of-social-impact-and-biases)
* [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
* [Dataset Curators](#dataset-curators)
* [Licensing Information](#licensing-information)
* [Citation Information](#citation-information)
## Dataset Description
### Abstract
DUQA is a dataset for single-step unit conversion questions. It comes in three sizes, "DUQA10k", "DUQA100k" and "DUQA1M", with 10,000, 100,000 and 1,000,000 entries respectively. Each size contains a mixture of basic and complex conversion questions, including simple conversion, multiple answer, max/min, argmax/argmin, and noisy/q-noisy questions. The complexity level varies based on the amount of information present in the sentence and the number of reasoning steps required to calculate a correct answer.
### Languages
The text in the dataset is in English.
## Dataset Structure
### Data Instances
A single instance in the dataset consists of a question related to a single-step unit conversion problem, along with its corresponding correct answer.
### Data Fields
The dataset contains fields for the question, the answer, and additional context about the question, along with multiple-choice answer options.
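To make the structure concrete, here is a sketch of what a single instance might look like. The field names and values below are illustrative assumptions only, not the dataset's documented schema:

```python
# Hypothetical DUQA-style instance; field names are assumptions for illustration.
example = {
    "question": "How many metres are there in 3 kilometres?",
    "choices": ["30", "300", "3000", "0.003"],
    "answer": "3000",
    "context": "simple conversion",
}

def km_to_m(km: float) -> float:
    """Single-step unit conversion: 1 km = 1000 m."""
    return km * 1000.0

# Because each question is a single-step conversion, the correct
# choice can be verified programmatically.
assert km_to_m(3) == float(example["answer"])
```

Consult the dataset files themselves for the actual column names and formats.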
## Data Statistics
The dataset comes in three sizes, with 10,000, 100,000 and 1,000,000 entries respectively.
## Dataset Creation
### Curation Rationale
The dataset is curated to help machine learning models understand and perform single-step unit conversions. This ability is essential for many real-world applications, including but not limited to physical sciences, engineering, and data analysis tasks.
### Source Data
The source data for the dataset is generated using a Python library provided with the dataset, which can create new datasets from a list of templates.
### Annotations
The dataset does not contain any annotations.
## Considerations for Using the Data
### Discussion of Social Impact and Biases
The dataset is neutral and does not contain any explicit biases or social implications as it deals primarily with mathematical conversion problems.
### Other Known Limitations
The complexity of the questions is limited to single-step unit conversions. It does not cover multi-step or more complex unit conversion problems.
## Additional Information
### Dataset Curators
The dataset was created by a team of researchers. More information might be needed to provide specific names or organizations.
### Licensing Information
The licensing information for this dataset is not provided. Please consult the dataset provider for more details.
### Citation Information
The citation information for this dataset is not provided. Please consult the dataset provider for more details.
|
ResplendentAI/NSFW_Format_Test | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- not-for-all-audiences
pretty_name: NSFW Format Test
--- |
liuyanchen1015/MULTI_VALUE_sst2_aint_be | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 5299
num_examples: 36
- name: test
num_bytes: 11318
num_examples: 79
- name: train
num_bytes: 148455
num_examples: 1330
download_size: 84818
dataset_size: 165072
---
# Dataset Card for "MULTI_VALUE_sst2_aint_be"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yantitarina34/stable_diffusion | ---
license: apache-2.0
---
|
aignosi/langchaing-docs-chatgpt-plugin | ---
license: apache-2.0
---
|
greenpau/amz-press-release | ---
license: cc-by-nc-4.0
task_categories:
- text-generation
language:
- en
pretty_name: Public Amazon Press Release Dataset
size_categories:
- 1K<n<10K
configs:
- config_name: default
data_files: 'data.jsonl.zst'
---
# amz-press-release
Public Amazon Press Release Dataset
## Dataset Description
This dataset contains data from Amazon News: http://amazon2022tf.q4web.com/news/default.aspx
## Dataset Structure
Each line in the downloaded data file is a JSON dictionary containing the following data.
```json
{
"headline": "Amazon's Buy with Prime Increases Shopper Conversion by an Average of 25%",
"url": "/news/news-details/2023/Amazons-Buy-with-Prime-Increases-Shopper-Conversion-by-an-Average-of-25/default.aspx",
"seo_name": "Amazons-Buy-with-Prime-Increases-Shopper-Conversion-by-an-Average-of-25",
"id": 4850,
"date": "01/10/2023 08:00:00",
"parsed_headline": "Amazon's Buy with Prime Increases Shopper Conversion by an Average of 25%",
"parsed_date": "01/10/2023",
"parsed_subheading_txt": "Previously available on an invite-only basis ...",
"parsed_subheading_html": "<div><p><i>Previously available on an invite-only basis ... </i></p></div>",
"parsed_body_txt": "SEATTLE--(BUSINESS WIRE)-- \nAmazon today announced that Buy with Prime ...",
"parsed_body_html": "<p>SEATTLE--(BUSINESS WIRE)-- Amazon today announced that Buy with Prime ...</p>"
}
```
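A minimal sketch of reading one such record with the standard library, using the sample fields shown above (note the `date` field uses `MM/DD/YYYY HH:MM:SS` format):

```python
import json
from datetime import datetime

# One JSONL line, abbreviated to a few of the fields from the sample above.
line = '{"headline": "Amazon\'s Buy with Prime Increases Shopper Conversion by an Average of 25%", "id": 4850, "date": "01/10/2023 08:00:00"}'
record = json.loads(line)

# Dates in this dataset appear as MM/DD/YYYY HH:MM:SS; parse accordingly.
published = datetime.strptime(record["date"], "%m/%d/%Y %H:%M:%S")
```

The full file is zstd-compressed (`data.jsonl.zst`), so in practice it would be decompressed first or loaded through a library that handles `.zst` transparently.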
### Citation Information
```bibtex
@misc{amz-press-release,
author = {Paul Greenberg},
title = {Public Amazon Press Release Dataset},
year = {2023},
publisher = {Hugging Face},
journal = {Hugging Face repository},
howpublished = {\\url{https://huggingface.co/datasets/greenpau/amz-press-release}},
}
``` |
owanr/o1o2o3_xl_r2_coedit | ---
dataset_info:
features:
- name: src
dtype: string
- name: tgt
sequence: string
splits:
- name: train
num_bytes: 18084526
num_examples: 35807
download_size: 7764285
dataset_size: 18084526
---
# Dataset Card for "o1o2o3_xl_r2_coedit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
manishiitg/unalignment-toxic-dpo-v0.2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: id
dtype: string
- name: system
dtype: string
splits:
- name: train
num_bytes: 3610985
num_examples: 1082
download_size: 1426016
dataset_size: 3610985
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hazardous/har | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': calling
'1': clapping
'2': cycling
'3': dancing
'4': drinking
'5': eating
'6': fighting
'7': hugging
'8': laughing
'9': listening_to_music
'10': running
'11': sitting
'12': sleeping
'13': texting
'14': using_laptop
splits:
- name: train
num_bytes: 208908112.2
num_examples: 12600
download_size: 227817680
dataset_size: 208908112.2
---
# Dataset Card for "har"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tippawan/en-th-full-abstract-pairs | ---
dataset_info:
features:
- name: translation
struct:
- name: en
dtype: string
- name: th
dtype: string
splits:
- name: train
num_bytes: 13578996
num_examples: 4998
- name: validation
num_bytes: 2518289
num_examples: 624
- name: test
num_bytes: 2783734
num_examples: 624
download_size: 7114435
dataset_size: 18881019
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
hotchpotch/ms_marco_japanese | ---
language:
- ja
license: other
license_name: same-ms-marco
license_link: https://huggingface.co/datasets/ms_marco
dataset_info:
config_name: v2.1-madlad400-3b
features:
- name: answers
sequence: string
- name: passages
sequence:
- name: is_selected
dtype: int32
- name: passage_text
dtype: string
- name: url
dtype: string
- name: query
dtype: string
- name: query_id
dtype: int32
- name: query_type
dtype: string
- name: wellFormedAnswers
sequence: string
splits:
- name: validation
num_bytes: 440690468
num_examples: 101093
- name: train
num_bytes: 3590508080
num_examples: 808731
- name: test
num_bytes: 430765349
num_examples: 101092
download_size: 2491144245
dataset_size: 4461963897
configs:
- config_name: v2.1-madlad400-3b
data_files:
- split: validation
path: v2.1-madlad400-3b/validation-*
- split: train
path: v2.1-madlad400-3b/train-*
- split: test
path: v2.1-madlad400-3b/test-*
---
# ms_marco_japanese
- Japanese translation of [ms_marco](https://huggingface.co/datasets/ms_marco).
- Translated with [google/madlad400-3b-mt](https://huggingface.co/google/madlad400-3b-mt).
- Stored in the same structure as the ms_marco dataset published on Hugging Face.
- The translation quality is not especially high, and some records contain traditional Chinese characters. [mMARCO](https://github.com/unicamp-dl/mMARCO), a multilingual ms_marco dataset translated with the Google Translate API, is of higher quality, so comparing this dataset against other translated datasets is recommended before using it.
- The wellFormedAnswers column is not translated.
- For speed, translation used [santhosh/madlad400-3b-ct2](https://huggingface.co/santhosh/madlad400-3b-ct2); processing the roughly 10 million target sentences took about 8 days on an RTX 3090.
## Usage
```python
from datasets import load_dataset
train_ds = load_dataset("hotchpotch/ms_marco_japanese", "v2.1-madlad400-3b", split="train")
validation_ds = load_dataset("hotchpotch/ms_marco_japanese", "v2.1-madlad400-3b", split="validation")
test_ds = load_dataset("hotchpotch/ms_marco_japanese", "v2.1-madlad400-3b", split="test")
```
```python
print(train_ds[0])
{'answers': ['マンハッタン計画の成功が直接的にもたらした影響は、原子力研究者や技術員達による素晴しい業績を覆い隠す唯一な雲であった。その成果と真実であるもの:何十万という無辜なる命々があきれていたことだろうか?'], 'passages': {'is_selected': [1, 0, 0, 0, 0, 0, 0, 0, 0, 0], 'passage_text': ['科学者の間でコミュニケーションが行われることは、マンハッタン計画を成功させるために重要であった。原子力研究家や技術員たちによって達成された素晴らしい業績には雲だけがあふれているものだろうか?その実際的な意味と言えば何十万という無辜なる人々へ生命も犠牲になっていることですね!', 'マンハッタン計画とその原子爆弾は第二次世界大戦の終結に寄与し、平和的な目標をもって核エネルギーが利用されたことで歴史や科学界には影響力があった。', 'マンハッタン計画は原子爆弾の製造が可能かどうかなんて見るために始められた。このプロジェクトを成功させれば、世界には永遠な変化がありそこまで強力で人工的であることも知らしむことになっただろいますからね.', 'マンハッタン計画(Manhattan Project)は、第二次世界大戦中にアメリカ合衆国で行われた原子爆弾開発プロジェクトの名称。特別には1942年から翌日までレスリー・R. グローブズ将軍が指揮する米陸军工兵隊によって実施されたものをいうことが多かったのである 。', 'また、各巻のバージョンと補完的なウェブサイトもある。最初に作られたのは『マンハッタン計画: インタラクティヴ・ヒストリー』であり([http://www.cfo-doe/me70_history)歴史遺産資源局および国家核安全保障庁によるものだったが現在では全て廃止されています(https//en](http://www.cfo-doe/me70_history)%E6%AD%B4%E5%8F%B2%E9%81%BA%E7%94%A3%E8%B3%87%E6%BA%90%E5%B1%80%E3%81%8A%E3%82%88%E3%81%B3%E5%9B%BD%E5%AE%B6%E6%A0%B8%E5%AE%89%E5%85%A8%E4%BF%9D%E9%9A%9C%E5%BA%81%E3%81%AB%E3%82%88%E3%82%8B%E3%82%82%E3%81%AE%E3%81%A0%E3%81%A3%E3%81%9F%E3%81%8C%E7%8F%BE%E5%9C%A8%E3%81%A7%E3%81%AF%E5%85%A8%E3%81%A6%E5%BB%83%E6%AD%A2%E3%81%95%E3%82%8C%E3%81%A6%E3%81%84%E3%81%BE%E3%81%99(https//en))', '原子爆弾は、1945年7月にニューメキシコ州の砂漠で初めて実験的な核兵器として使用された。その後も多くが開発され続けたものだったのである(マンハッタン計画)。', 'また、原爆や第二次世界大戦の終結に関する非常によく豊富な文献を置き換える試みもない。本コレクションはマンハッタン計画について起源と発展が記録されることには努めていませんのである 。', 'マンハッタン計画(Manhattan Project)は、第二次世界大戦中に最初の核兵器を生産した研究開発事業である。イギリスとカナダによる支援下アメリカ合衆国が主導していたものだった 。1942年から1946年代までこのプロジェクトには米陸軍工廠少将レスリー・グローブス (Leslie Groves) (英語版 )(en:Lesley G.Grove, US Army Corp of Engineer), ロサンゼル斯原子力実験場所長ロバート·オペンハーマーらも参加しており,その間爆弾設計者として活躍していることでも知られていたのであり ,また彼等自身について言及する必要性があると考えている人物であることなどよりこれ以上詳細な情報ではないかという意見がありました', '1942年6月、アメリカ陸軍工兵隊はマンハッタン計画を開始した。原子爆弾の秘密名称であるが.', 'マンハッタン計画のB炉がハンフォードに建設される理由は、北アメリカ沿岸から太平洋へ流れ込む最大級河川であるコロンビア湖と近いことだった。'], 'url': ['[http://www.pitt.edu/~sdb14/atombomb.html](http://www.pitt.edu/~sdb14/atombomb.html)', 
'[http://www.osti.gov/accomplishments/manhattan_story.html](http://www.osti.gov/accomplishments/manhattan_story.html)', '[http://www.123helpme.com/impact-of-the-manhattan-project-preview.asp?id=177337](http://www.123helpme.com/impact-of-the-manhattan-project-preview.asp?id=177337)', '[http://www.answers.com/Q/How_did_the_Manhattan_Project_impact_on_society](http://www.answers.com/Q/How_did_the_Manhattan_Project_impact_on_society)', '[https://www.osti.gov/manhattan-project-history/publications/Manhattan_Project_2010.pdf](https://www.osti.gov/manhattan-project-history/publications/Manhattan_Project_2010.pdf)', '[http://www.ushistory.org/us/51f.asp](http://www.ushistory.org/us/51f.asp)', '[http://nsarchive.gwu.edu/NSAEBB/NSAEBB162](http://nsarchive.gwu.edu/NSAEBB/NSAEBB162)', '[https://en.wikipedia.org/wiki/Manhattan_Project](https://en.wikipedia.org/wiki/Manhattan_Project)', '[https://quizlet.com/41456230/a-bomb-flash-cards/](https://quizlet.com/41456230/a-bomb-flash-cards/)', '[https://www.atomicheritage.org/history/environmental-consequences](https://www.atomicheritage.org/history/environmental-consequences)']}, 'query': '(マンハッタン計画の成功が直接的にもたらした影響は何でしょうか。', 'query_id': 1185869, 'query_type': 'DESCRIPTION', 'wellFormedAnswers': []}
```
## License
- Same as [ms_marco](https://huggingface.co/datasets/ms_marco). |
Plachta/GLIP-test-images | ---
license: apache-2.0
---
|
gagan3012/toxicRMv2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: candidates
list:
- name: decoding_method
dtype: string
- name: model
dtype: string
- name: scores
struct:
- name: pairrm
dtype: float64
- name: safety
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 50284677.0707871
num_examples: 57103
- name: test
num_bytes: 508103.9292128988
num_examples: 577
download_size: 23537168
dataset_size: 50792781.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
GilsonRDF/Teste | ---
dataset_info:
features:
- name: conversation
dtype: string
splits:
- name: train
num_bytes: 5450.4
num_examples: 24
- name: test
num_bytes: 1362.6
num_examples: 6
download_size: 6033
dataset_size: 6813.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
David-Xu/astronomy-stack-dpo-text | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 53775582
num_examples: 17942
- name: test
num_bytes: 6055646
num_examples: 1993
download_size: 17732927
dataset_size: 59831228
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
ascolda/ru_en_Crystallography_and_Spectroscopy | ---
task_categories:
- translation
language:
- ru
- en
size_categories:
- 10K<n<100K
tags:
- chemistry
--- |
yuanmei424/xxt_en | ---
dataset_info:
features:
- name: edit_prompt
dtype: string
- name: input_image
dtype: image
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 5329195147.25
num_examples: 2283951
download_size: 526250170
dataset_size: 5329195147.25
---
# Dataset Card for "xxt_en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/wikiclir_tl | ---
pretty_name: '`wikiclir/tl`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `wikiclir/tl`
The `wikiclir/tl` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/wikiclir#wikiclir/tl).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=79,008
- `queries` (i.e., topics); count=48,930
- `qrels`: (relevance assessments); count=72,359
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/wikiclir_tl', 'docs')
for record in docs:
record # {'doc_id': ..., 'title': ..., 'text': ...}
queries = load_dataset('irds/wikiclir_tl', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/wikiclir_tl', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{sasaki-etal-2018-cross,
title = "Cross-Lingual Learning-to-Rank with Shared Representations",
author = "Sasaki, Shota and
Sun, Shuo and
Schamoni, Shigehiko and
Duh, Kevin and
Inui, Kentaro",
booktitle = "Proceedings of the 2018 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)",
month = jun,
year = "2018",
address = "New Orleans, Louisiana",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/N18-2073",
doi = "10.18653/v1/N18-2073",
pages = "458--463"
}
```
|
medric49/dolly-rag-gpt4-ins | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 7282892
num_examples: 4467
download_size: 4479160
dataset_size: 7282892
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dolly-rag-gpt4-ins"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/3db56ea8 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 178
num_examples: 10
download_size: 1331
dataset_size: 178
---
# Dataset Card for "3db56ea8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_wnli_present_modals | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 3384
num_examples: 15
- name: test
num_bytes: 10975
num_examples: 39
- name: train
num_bytes: 20618
num_examples: 107
download_size: 20617
dataset_size: 34977
---
# Dataset Card for "MULTI_VALUE_wnli_present_modals"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
plncmm/wl-body-part | ---
license: cc-by-nc-4.0
---
|
nikraf/uniref128-256AA | ---
dataset_info:
features:
- name: seqs
dtype: string
splits:
- name: train
num_bytes: 92114464
num_examples: 487832
download_size: 92373850
dataset_size: 92114464
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_jsfs11__SnorkelWestBeagle-DARETIES-7B | ---
pretty_name: Evaluation run of jsfs11/SnorkelWestBeagle-DARETIES-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jsfs11/SnorkelWestBeagle-DARETIES-7B](https://huggingface.co/jsfs11/SnorkelWestBeagle-DARETIES-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jsfs11__SnorkelWestBeagle-DARETIES-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-25T09:47:01.298299](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__SnorkelWestBeagle-DARETIES-7B/blob/main/results_2024-01-25T09-47-01.298299.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6479799815457161,\n\
\ \"acc_stderr\": 0.032161535797548865,\n \"acc_norm\": 0.6485286656946667,\n\
\ \"acc_norm_stderr\": 0.03282087535821274,\n \"mc1\": 0.5630354957160343,\n\
\ \"mc1_stderr\": 0.017363844503195953,\n \"mc2\": 0.7005107732516146,\n\
\ \"mc2_stderr\": 0.014999534657573073\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.01358257109581529,\n\
\ \"acc_norm\": 0.71160409556314,\n \"acc_norm_stderr\": 0.013238394422428173\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.711611232822147,\n\
\ \"acc_stderr\": 0.004520870679457037,\n \"acc_norm\": 0.8735311690898228,\n\
\ \"acc_norm_stderr\": 0.0033169770861701505\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033477,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033477\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37407407407407406,\n \"acc_stderr\": 0.02950286112895529,\n \
\ \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.02950286112895529\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848043,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848043\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n\
\ \"acc_stderr\": 0.026756401538078966,\n \"acc_norm\": 0.8235294117647058,\n\
\ \"acc_norm_stderr\": 0.026756401538078966\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n\
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.01987565502786744,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.01987565502786744\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677006,\n\
\ \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677006\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n\
\ \"acc_stderr\": 0.01662399851333311,\n \"acc_norm\": 0.44581005586592176,\n\
\ \"acc_norm_stderr\": 0.01662399851333311\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958143,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958143\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537368,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537368\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6437908496732027,\n \"acc_stderr\": 0.0193733324207245,\n \
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.0193733324207245\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.024112678240900808,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.024112678240900808\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5630354957160343,\n\
\ \"mc1_stderr\": 0.017363844503195953,\n \"mc2\": 0.7005107732516146,\n\
\ \"mc2_stderr\": 0.014999534657573073\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8318863456985004,\n \"acc_stderr\": 0.010510336954166742\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6209249431387415,\n \
\ \"acc_stderr\": 0.01336363029508836\n }\n}\n```"
repo_url: https://huggingface.co/jsfs11/SnorkelWestBeagle-DARETIES-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|arc:challenge|25_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|gsm8k|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hellaswag|10_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T09-47-01.298299.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T09-47-01.298299.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- '**/details_harness|winogrande|5_2024-01-25T09-47-01.298299.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-25T09-47-01.298299.parquet'
- config_name: results
data_files:
- split: 2024_01_25T09_47_01.298299
path:
- results_2024-01-25T09-47-01.298299.parquet
- split: latest
path:
- results_2024-01-25T09-47-01.298299.parquet
---
# Dataset Card for Evaluation run of jsfs11/SnorkelWestBeagle-DARETIES-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jsfs11/SnorkelWestBeagle-DARETIES-7B](https://huggingface.co/jsfs11/SnorkelWestBeagle-DARETIES-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jsfs11__SnorkelWestBeagle-DARETIES-7B",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-25T09:47:01.298299](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__SnorkelWestBeagle-DARETIES-7B/blob/main/results_2024-01-25T09-47-01.298299.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the per-task configurations, under the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6479799815457161,
"acc_stderr": 0.032161535797548865,
"acc_norm": 0.6485286656946667,
"acc_norm_stderr": 0.03282087535821274,
"mc1": 0.5630354957160343,
"mc1_stderr": 0.017363844503195953,
"mc2": 0.7005107732516146,
"mc2_stderr": 0.014999534657573073
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.01358257109581529,
"acc_norm": 0.71160409556314,
"acc_norm_stderr": 0.013238394422428173
},
"harness|hellaswag|10": {
"acc": 0.711611232822147,
"acc_stderr": 0.004520870679457037,
"acc_norm": 0.8735311690898228,
"acc_norm_stderr": 0.0033169770861701505
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033477,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033477
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.02950286112895529,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.02950286112895529
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848043,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848043
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.01987565502786744,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.01987565502786744
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677006,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677006
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44581005586592176,
"acc_stderr": 0.01662399851333311,
"acc_norm": 0.44581005586592176,
"acc_norm_stderr": 0.01662399851333311
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958143,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537368,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537368
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.0193733324207245,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.0193733324207245
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.024112678240900808,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.024112678240900808
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5630354957160343,
"mc1_stderr": 0.017363844503195953,
"mc2": 0.7005107732516146,
"mc2_stderr": 0.014999534657573073
},
"harness|winogrande|5": {
"acc": 0.8318863456985004,
"acc_stderr": 0.010510336954166742
},
"harness|gsm8k|5": {
"acc": 0.6209249431387415,
"acc_stderr": 0.01336363029508836
}
}
```
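As a quick illustration of how these results can be post-processed, the sketch below averages the per-task MMLU (`hendrycksTest-*`) accuracies from a small excerpt of the JSON above. In practice you would `json.load` the full results file; the excerpt here copies only three of the values shown above.

```python
# Minimal sketch: average the MMLU (hendrycksTest) accuracies from a results dict.
# This excerpt contains only three entries; the real file holds all tasks above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6296296296296297},
    "harness|gsm8k|5": {"acc": 0.6209249431387415},
}

# Select only the MMLU subtasks by their "harness|hendrycksTest-..." key prefix.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_mean, 4))  # → 0.4798
```

The same key-prefix filter works for any task family in the harness naming scheme (e.g. `harness|truthfulqa:mc|0`).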
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
manu/english-60b | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: id
dtype: string
- name: dataset_id
dtype: string
splits:
- name: train
num_bytes: 259969046699
num_examples: 58986336
- name: test
num_bytes: 43278365
num_examples: 10000
download_size: 151705709032
dataset_size: 260012325064
---
# Dataset Card for "english_20b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SportsShot/SportsShot | ---
license: cc-by-nc-4.0
---
|
Falah/random_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 27245594
num_examples: 100000
download_size: 4512640
dataset_size: 27245594
---
# Dataset Card for "random_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/sakurakouji_kinako_lovelivesuperstar | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sakurakouji_kinako/桜小路きな子/사쿠라코지키나코 (Love Live! Superstar!!)
This is the dataset of sakurakouji_kinako/桜小路きな子/사쿠라코지키나코 (Love Live! Superstar!!), containing 179 images and their tags.
The core tags of this character are `bangs, brown_hair, long_hair, green_eyes, twintails, low_twintails, braid, blunt_bangs, ribbon, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 179 | 238.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sakurakouji_kinako_lovelivesuperstar/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 179 | 117.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sakurakouji_kinako_lovelivesuperstar/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 415 | 258.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sakurakouji_kinako_lovelivesuperstar/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 179 | 201.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sakurakouji_kinako_lovelivesuperstar/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 415 | 400.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sakurakouji_kinako_lovelivesuperstar/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sakurakouji_kinako_lovelivesuperstar',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, blue_jacket, grey_dress, long_sleeves, looking_at_viewer, neck_ribbon, solo, yuigaoka_school_uniform, smile, black_pantyhose, open_mouth, red_ribbon, blush, pinafore_dress, brown_footwear, full_body, loafers, collared_shirt, white_background |
| 1 | 7 |  |  |  |  |  | 1girl, blue_jacket, blush, grey_dress, long_sleeves, looking_at_viewer, open_jacket, solo, yuigaoka_school_uniform, neck_ribbon, pinafore_dress, red_ribbon, white_background, black_pantyhose, petals, smile, white_shirt, closed_mouth, collared_shirt, french_braid, hair_ribbon, simple_background, upper_body |
| 2 | 7 |  |  |  |  |  | 1girl, beret, looking_at_viewer, solo, blue_headwear, short_sleeves, smile, birthday, dress, jacket, blush, collarbone, open_mouth, pink_gloves, white_background |
| 3 | 11 |  |  |  |  |  | 1girl, solo, fingerless_gloves, looking_at_viewer, smile, white_gloves, sleeveless, blush, open_mouth, arm_up, armpits, bow, clothes_around_waist, skirt, confetti, medium_breasts, green_ribbon |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_jacket | grey_dress | long_sleeves | looking_at_viewer | neck_ribbon | solo | yuigaoka_school_uniform | smile | black_pantyhose | open_mouth | red_ribbon | blush | pinafore_dress | brown_footwear | full_body | loafers | collared_shirt | white_background | open_jacket | petals | white_shirt | closed_mouth | french_braid | hair_ribbon | simple_background | upper_body | beret | blue_headwear | short_sleeves | birthday | dress | jacket | collarbone | pink_gloves | fingerless_gloves | white_gloves | sleeveless | arm_up | armpits | bow | clothes_around_waist | skirt | confetti | medium_breasts | green_ribbon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:-------------|:---------------|:--------------------|:--------------|:-------|:--------------------------|:--------|:------------------|:-------------|:-------------|:--------|:-----------------|:-----------------|:------------|:----------|:-----------------|:-------------------|:--------------|:---------|:--------------|:---------------|:---------------|:--------------|:--------------------|:-------------|:--------|:----------------|:----------------|:-----------|:--------|:---------|:-------------|:--------------|:--------------------|:---------------|:-------------|:---------|:----------|:------|:-----------------------|:--------|:-----------|:-----------------|:---------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | | X | | X | | X | | X | | X | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | | | | X | | X | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
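The cluster tables above list the tags shared within each image cluster. A minimal, pure-Python sketch of how such a shared-tag list could be derived from per-image tag lists follows; the sample data below is hypothetical, not taken from this dataset.

```python
from collections import Counter

# Hypothetical per-image tag lists for one cluster (illustrative only).
cluster_items = [
    ["1girl", "solo", "smile", "blue_jacket"],
    ["1girl", "solo", "open_mouth", "blue_jacket"],
    ["1girl", "solo", "smile", "grey_dress"],
]

# Count how many images in the cluster carry each tag.
counts = Counter(tag for tags in cluster_items for tag in tags)

# Keep tags present in at least two-thirds of the cluster's images.
threshold = 2 * len(cluster_items) / 3
shared_tags = sorted(tag for tag, n in counts.items() if n >= threshold)
print(shared_tags)  # → ['1girl', 'blue_jacket', 'smile', 'solo']
```

The two-thirds threshold is an assumption for this sketch; the actual clustering pipeline may use a different cutoff.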
|
CyberHarem/aihara_yukino_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of aihara_yukino/相原雪乃 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of aihara_yukino/相原雪乃 (THE iDOLM@STER: Cinderella Girls), containing 28 images and their tags.
The core tags of this character are `brown_hair, long_hair, braid, brown_eyes, single_braid, very_long_hair, breasts, hat, bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 28 | 20.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aihara_yukino_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 28 | 17.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aihara_yukino_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 49 | 27.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aihara_yukino_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 28 | 20.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aihara_yukino_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 49 | 30.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aihara_yukino_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/aihara_yukino_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, smile, solo, dress, large_breasts, looking_at_viewer, necklace, cleavage, gloves, hair_bow, sitting, teacup |
| 1 | 10 |  |  |  |  |  | 1girl, smile, solo, card_(medium), character_name, flower_(symbol), open_mouth, dress, gloves, hair_ornament, pink_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | solo | dress | large_breasts | looking_at_viewer | necklace | cleavage | gloves | hair_bow | sitting | teacup | card_(medium) | character_name | flower_(symbol) | open_mouth | hair_ornament | pink_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------|:----------------|:--------------------|:-----------|:-----------|:---------|:-----------|:----------|:---------|:----------------|:-----------------|:------------------|:-------------|:----------------|:------------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | | | | | X | | | | X | X | X | X | X | X |
|
daqc/wikihow_es | ---
dataset_info:
features:
- name: title
dtype: string
- name: section_name
dtype: string
- name: summary
dtype: string
- name: document
dtype: string
- name: english_section_name
dtype: string
- name: english_url
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 323465146
num_examples: 113160
download_size: 173101313
dataset_size: 323465146
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
codesagar/malicious-llm-prompts | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
- name: malicious
dtype: bool
- name: reasoning
dtype: string
- name: attack_type
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 2859138
num_examples: 3570
- name: validation
num_bytes: 641063
num_examples: 763
- name: test
num_bytes: 677615
num_examples: 765
download_size: 2405757
dataset_size: 4177816
---
# Dataset Card for "malicious-llm-prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Giacinta/heehe | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
pretty_name: ll
size_categories:
- 1K<n<10K
--- |
CyberHarem/archetto_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of archetto/アルケット/空弦 (Arknights)
This is the dataset of archetto/アルケット/空弦 (Arknights), containing 212 images and their tags.
The core tags of this character are `animal_ears, long_hair, blue_eyes, red_eyes, heterochromia, blonde_hair, breasts, tail, very_long_hair, brown_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 212 | 434.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/archetto_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 212 | 360.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/archetto_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 567 | 723.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/archetto_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/archetto_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, black_dress, looking_at_viewer, red_cape, solo, black_gloves, epaulettes, simple_background, tiara, white_background, medium_breasts, blush, cowboy_shot, open_mouth, partially_fingerless_gloves, :d, upper_body |
| 1 | 20 |  |  |  |  |  | 1girl, red_cape, solo, holding_bow_(weapon), arrow_(projectile), black_dress, black_gloves, looking_at_viewer, epaulettes, knee_boots, simple_background, white_background, full_body, medium_breasts, partially_fingerless_gloves, white_footwear, cross-laced_footwear, frilled_dress, tiara, thigh_strap, infection_monitor_(arknights) |
| 2 | 49 |  |  |  |  |  | 1girl, official_alternate_costume, solo, shirt, short_sleeves, looking_at_viewer, hair_bow, white_skirt, white_gloves, midriff, blush, open_mouth, navel, red_bowtie, crop_top, fingerless_gloves, miniskirt, cowboy_shot, infection_monitor_(arknights), parted_bangs, :d, white_background, epaulettes, holding, lion_tail, blue_jacket, microphone, two_side_up |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_dress | looking_at_viewer | red_cape | solo | black_gloves | epaulettes | simple_background | tiara | white_background | medium_breasts | blush | cowboy_shot | open_mouth | partially_fingerless_gloves | :d | upper_body | holding_bow_(weapon) | arrow_(projectile) | knee_boots | full_body | white_footwear | cross-laced_footwear | frilled_dress | thigh_strap | infection_monitor_(arknights) | official_alternate_costume | shirt | short_sleeves | hair_bow | white_skirt | white_gloves | midriff | navel | red_bowtie | crop_top | fingerless_gloves | miniskirt | parted_bangs | holding | lion_tail | blue_jacket | microphone | two_side_up |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------------------|:-----------|:-------|:---------------|:-------------|:--------------------|:--------|:-------------------|:-----------------|:--------|:--------------|:-------------|:------------------------------|:-----|:-------------|:-----------------------|:---------------------|:-------------|:------------|:-----------------|:-----------------------|:----------------|:--------------|:--------------------------------|:-----------------------------|:--------|:----------------|:-----------|:--------------|:---------------|:----------|:--------|:-------------|:-----------|:--------------------|:------------|:---------------|:----------|:------------|:--------------|:-------------|:--------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 2 | 49 |  |  |  |  |  | X | | X | | X | | X | | | X | | X | X | X | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_KoboldAI__fairseq-dense-1.3B | ---
pretty_name: Evaluation run of KoboldAI/fairseq-dense-1.3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KoboldAI/fairseq-dense-1.3B](https://huggingface.co/KoboldAI/fairseq-dense-1.3B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of\
\ the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KoboldAI__fairseq-dense-1.3B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-19T04:22:19.785222](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__fairseq-dense-1.3B/blob/main/results_2023-10-19T04-22-19.785222.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the \"results\" and \"latest\" splits for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.024119127516778523,\n\
\ \"em_stderr\": 0.0015711543458424907,\n \"f1\": 0.10603817114093886,\n\
\ \"f1_stderr\": 0.002447898366394225,\n \"acc\": 0.2951854775059195,\n\
\ \"acc_stderr\": 0.006910524554827735\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.024119127516778523,\n \"em_stderr\": 0.0015711543458424907,\n\
\ \"f1\": 0.10603817114093886,\n \"f1_stderr\": 0.002447898366394225\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.590370955011839,\n\
\ \"acc_stderr\": 0.01382104910965547\n }\n}\n```"
repo_url: https://huggingface.co/KoboldAI/fairseq-dense-1.3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T04_22_19.785222
path:
- '**/details_harness|drop|3_2023-10-19T04-22-19.785222.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T04-22-19.785222.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T04_22_19.785222
path:
- '**/details_harness|gsm8k|5_2023-10-19T04-22-19.785222.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-19T04-22-19.785222.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T04_22_19.785222
path:
- '**/details_harness|winogrande|5_2023-10-19T04-22-19.785222.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T04-22-19.785222.parquet'
- config_name: results
data_files:
- split: 2023_10_19T04_22_19.785222
path:
- results_2023-10-19T04-22-19.785222.parquet
- split: latest
path:
- results_2023-10-19T04-22-19.785222.parquet
---
# Dataset Card for Evaluation run of KoboldAI/fairseq-dense-1.3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KoboldAI/fairseq-dense-1.3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KoboldAI/fairseq-dense-1.3B](https://huggingface.co/KoboldAI/fairseq-dense-1.3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KoboldAI__fairseq-dense-1.3B",
"harness_winogrande_5",
split="train")
```
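The configuration names in this card follow the pattern `harness_<task>_<num_fewshot>` (e.g. `harness_drop_3`, `harness_gsm8k_5`). A minimal helper to build such names — the function name is hypothetical, shown only for illustration:

```python
def harness_config(task: str, num_fewshot: int) -> str:
    """Build a details config name such as 'harness_winogrande_5'.

    `task` may itself contain underscores (e.g. 'hendrycksTest_anatomy').
    """
    return f"harness_{task}_{num_fewshot}"

# harness_config("winogrande", 5) -> "harness_winogrande_5"
```

Passing the resulting name as the second argument of `load_dataset`, as in the snippet above, selects that task's per-sample details.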
## Latest results
These are the [latest results from run 2023-10-19T04:22:19.785222](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__fairseq-dense-1.3B/blob/main/results_2023-10-19T04-22-19.785222.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the "results" and "latest" splits for each eval):
```python
{
"all": {
"em": 0.024119127516778523,
"em_stderr": 0.0015711543458424907,
"f1": 0.10603817114093886,
"f1_stderr": 0.002447898366394225,
"acc": 0.2951854775059195,
"acc_stderr": 0.006910524554827735
},
"harness|drop|3": {
"em": 0.024119127516778523,
"em_stderr": 0.0015711543458424907,
"f1": 0.10603817114093886,
"f1_stderr": 0.002447898366394225
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.590370955011839,
"acc_stderr": 0.01382104910965547
}
}
```
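Since the results above are plain JSON, per-task metrics can be pulled out with a dict comprehension once the file is loaded. A small sketch using values from this run (the variable names are illustrative):

```python
# Metrics copied from the "Latest results" JSON above (stderr fields omitted for brevity).
latest = {
    "all": {"em": 0.024119127516778523, "f1": 0.10603817114093886, "acc": 0.2951854775059195},
    "harness|drop|3": {"em": 0.024119127516778523, "f1": 0.10603817114093886},
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.590370955011839},
}

# Per-task accuracy, skipping the aggregate "all" entry and tasks without an "acc" metric.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in latest.items()
    if task != "all" and "acc" in metrics
}
```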
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_codellama__CodeLlama-7b-Instruct-hf | ---
pretty_name: Evaluation run of codellama/CodeLlama-7b-Instruct-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [codellama/CodeLlama-7b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of\
\ the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codellama__CodeLlama-7b-Instruct-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-14T16:13:27.845445](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-7b-Instruct-hf/blob/main/results_2023-10-14T16-13-27.845445.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the \"results\" and \"latest\" splits for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n\
\ \"em_stderr\": 0.00029649629898012493,\n \"f1\": 0.05166841442953039,\n\
\ \"f1_stderr\": 0.0012678878311342997,\n \"acc\": 0.36261266786861684,\n\
\ \"acc_stderr\": 0.010449619353516184\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.00029649629898012493,\n\
\ \"f1\": 0.05166841442953039,\n \"f1_stderr\": 0.0012678878311342997\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07960576194086429,\n \
\ \"acc_stderr\": 0.00745592433867628\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6456195737963694,\n \"acc_stderr\": 0.013443314368356088\n\
\ }\n}\n```"
repo_url: https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|arc:challenge|25_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|arc:challenge|25_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_14T16_13_27.845445
path:
- '**/details_harness|drop|3_2023-10-14T16-13-27.845445.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-14T16-13-27.845445.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_14T16_13_27.845445
path:
- '**/details_harness|gsm8k|5_2023-10-14T16-13-27.845445.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-14T16-13-27.845445.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hellaswag|10_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hellaswag|10_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_14T16_13_27.845445
path:
- '**/details_harness|winogrande|5_2023-10-14T16-13-27.845445.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-14T16-13-27.845445.parquet'
- config_name: results
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- results_2023-08-25T17:04:00.078187.parquet
- split: 2023_08_26T03_58_42.829453
path:
- results_2023-08-26T03:58:42.829453.parquet
- split: 2023_10_14T16_13_27.845445
path:
- results_2023-10-14T16-13-27.845445.parquet
- split: latest
path:
- results_2023-10-14T16-13-27.845445.parquet
---
# Dataset Card for Evaluation run of codellama/CodeLlama-7b-Instruct-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [codellama/CodeLlama-7b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codellama__CodeLlama-7b-Instruct-hf",
"harness_winogrande_5",
split="train")
```
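Judging from the configurations above, each run's split name is derived from its ISO timestamp by replacing hyphens and colons with underscores (e.g. `2023-08-26T03:58:42.829453` becomes `2023_08_26T03_58_42.829453`). A minimal helper, inferred from the split names in the YAML rather than from any official API, could look like:

```python
def split_name_from_timestamp(timestamp: str) -> str:
    """Convert a run's ISO timestamp into the split name used in this dataset.

    Inferred from the split names visible in the configurations above:
    hyphens and colons are replaced with underscores.
    """
    return timestamp.replace("-", "_").replace(":", "_")


print(split_name_from_timestamp("2023-08-26T03:58:42.829453"))
# 2023_08_26T03_58_42.829453
```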
## Latest results
These are the [latest results from run 2023-10-14T16:13:27.845445](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-7b-Instruct-hf/blob/main/results_2023-10-14T16-13-27.845445.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0008389261744966443,
"em_stderr": 0.00029649629898012493,
"f1": 0.05166841442953039,
"f1_stderr": 0.0012678878311342997,
"acc": 0.36261266786861684,
"acc_stderr": 0.010449619353516184
},
"harness|drop|3": {
"em": 0.0008389261744966443,
"em_stderr": 0.00029649629898012493,
"f1": 0.05166841442953039,
"f1_stderr": 0.0012678878311342997
},
"harness|gsm8k|5": {
"acc": 0.07960576194086429,
"acc_stderr": 0.00745592433867628
},
"harness|winogrande|5": {
"acc": 0.6456195737963694,
"acc_stderr": 0.013443314368356088
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
fantasyfish/laion-art | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: aesthetic
dtype: float64
splits:
- name: train
num_bytes: 11640624315.8
num_examples: 20072
- name: test
num_bytes: 538961083.0
num_examples: 855
download_size: 12347056207
dataset_size: 12179585398.8
---
# Dataset Card for "laion-art"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
daniilak/Russia_Real_Estate_2018_2021 | ---
license: cc
---
### Context
The dataset consists of unique listings collected from popular real-estate portals in Russia, and contains more than 540,000 objects.
### Content
The Russian real estate market has a relatively short history. In the Soviet era, all properties were state-owned; people only had the right to use them with apartments allocated based on one's place of work. As a result, options for moving were fairly limited. However, after the fall of the Soviet Union, the Russian real estate market emerged and Muscovites could privatize and subsequently sell and buy properties for the first time. Today, Russian real estate is booming. It offers many exciting opportunities and high returns for lifestyle and investment.
The real estate market has been in a growth phase for several years, which means that you can still find properties at very attractive prices, but with good chances of increasing their value in the future.
### Dataset
The dataset has 13 fields.
- date - date of publication of the announcement
- time - time when the ad was published
- geo_lat - latitude
- geo_lon - longitude
- region - region of Russia (the country has 85 federal subjects in total)
- building_type - facade type: 0 - other; 1 - panel; 2 - monolithic; 3 - brick; 4 - blocky; 5 - wooden
- object_type - apartment type: 1 - secondary real-estate market; 2 - new building
- level - apartment floor
- levels - number of storeys in the building
- rooms - number of living rooms; a value of -1 means "studio apartment"
- area - total area of the apartment
- kitchen_area - kitchen area
- price - price in rubles
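The categorical encodings listed above can be decoded into human-readable values with a small helper. The mappings below are taken directly from the field descriptions; the function name and output fields are illustrative, not part of the dataset itself.

```python
# Mappings taken from the field descriptions above.
BUILDING_TYPE = {
    0: "Other", 1: "Panel", 2: "Monolithic",
    3: "Brick", 4: "Blocky", 5: "Wooden",
}
OBJECT_TYPE = {1: "Secondary market", 2: "New building"}


def decode_listing(row: dict) -> dict:
    """Return a copy of a raw listing with human-readable categories."""
    out = dict(row)
    out["building_type"] = BUILDING_TYPE.get(row["building_type"], "Unknown")
    out["object_type"] = OBJECT_TYPE.get(row["object_type"], "Unknown")
    # rooms == -1 encodes a studio apartment
    out["is_studio"] = row["rooms"] == -1
    return out


listing = {"building_type": 3, "object_type": 1, "rooms": -1, "area": 28.5}
print(decode_listing(listing))
```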
### Attention
The dataset may contain erroneous data caused by input errors on the source portals, as well as outliers and other anomalies.
### :)
Using this dataset, we invite Kagglers to develop algorithms that use a wide range of features to predict real estate prices. Competitors can rely on a vast dataset that includes housing data and macroeconomic indicators. An accurate forecasting model gives clients more confidence in a volatile economy.
|
Isamu136/custom_diffusion_eval_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: ibot_b_16_embedding
sequence: float32
- name: moco_vitb_imagenet_embeddings_without_last_layer
sequence: float32
- name: clip_vision_l14
sequence: float32
- name: clip_l14
sequence: float32
splits:
- name: train
num_bytes: 200864257.0
num_examples: 64
download_size: 201259767
dataset_size: 200864257.0
---
# Dataset Card for "custom_diffusion_eval_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rohithmr12/med-data | ---
license: mit
---
|
yezhengli9/wmt20-ru-en | ---
dataset_info:
features:
- name: id (string)
dtype: string
- name: translation (translation)
dtype: string
splits:
- name: train
num_bytes: 694166
num_examples: 991
download_size: 267391
dataset_size: 694166
---
# Dataset Card for "wmt20-ru-en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/maryberry_lapisrelights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Maryberry (Lapis Re:LiGHTs)
This is the dataset of Maryberry (Lapis Re:LiGHTs), containing 58 images and their tags.
The core tags of this character are `bangs, purple_eyes, hair_between_eyes, short_hair, blue_hair, long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 58 | 32.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maryberry_lapisrelights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 58 | 27.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maryberry_lapisrelights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 104 | 46.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maryberry_lapisrelights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 58 | 32.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maryberry_lapisrelights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 104 | 52.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maryberry_lapisrelights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/maryberry_lapisrelights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Results of tag clustering; some outfits may be minable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, solo, blush, closed_mouth, holding, indoors, skirt, aqua_hair, smile, chair, puffy_short_sleeves, school_uniform, upper_body |
| 1 | 16 |  |  |  |  |  | 1girl, solo, sleeveless, dress, hat, looking_at_viewer, white_headwear, aqua_hair, open_mouth, ribbon, smile, white_thighhighs, sailor_collar |
| 2 | 9 |  |  |  |  |  | 2girls, closed_mouth, bare_shoulders, blush, sleeveless, solo_focus, shirt, smile, black_gloves, blurry, dress, holding_hands, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | closed_mouth | holding | indoors | skirt | aqua_hair | smile | chair | puffy_short_sleeves | school_uniform | upper_body | sleeveless | dress | hat | looking_at_viewer | white_headwear | open_mouth | ribbon | white_thighhighs | sailor_collar | 2girls | bare_shoulders | solo_focus | shirt | black_gloves | blurry | holding_hands |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:---------------|:----------|:----------|:--------|:------------|:--------|:--------|:----------------------|:-----------------|:-------------|:-------------|:--------|:------|:--------------------|:-----------------|:-------------|:---------|:-------------------|:----------------|:---------|:-----------------|:-------------|:--------|:---------------|:---------|:----------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 16 |  |  |  |  |  | X | X | | | | | | X | X | | | | | X | X | X | X | X | X | X | X | X | | | | | | | |
| 2 | 9 |  |  |  |  |  | | | X | X | | | | | X | | | | | X | X | | X | | | | | | X | X | X | X | X | X | X |
|
awettig/Pile-HackerNews-0.5B-6K-opt | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 6359132637
num_examples: 81380
- name: test
num_bytes: 64945692
num_examples: 813
download_size: 1710629426
dataset_size: 6424078329
---
# Dataset Card for "Pile-HackerNews-0.5B-6K-opt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
VasquesXavier/Ressentiment | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 69808
num_examples: 3
download_size: 45123
dataset_size: 69808
---
# Dataset Card for "Ressentiment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pipi00pipi/bibi_he | ---
license: openrail
---
|
Multimodal-Fatima/StanfordCars_test | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': am general hummer suv 2000
'1': acura rl sedan 2012
'2': acura tl sedan 2012
'3': acura tl type-s 2008
'4': acura tsx sedan 2012
'5': acura integra type r 2001
'6': acura zdx hatchback 2012
'7': aston martin v8 vantage convertible 2012
'8': aston martin v8 vantage coupe 2012
'9': aston martin virage convertible 2012
'10': aston martin virage coupe 2012
'11': audi rs 4 convertible 2008
'12': audi a5 coupe 2012
'13': audi tts coupe 2012
'14': audi r8 coupe 2012
'15': audi v8 sedan 1994
'16': audi 100 sedan 1994
'17': audi 100 wagon 1994
'18': audi tt hatchback 2011
'19': audi s6 sedan 2011
'20': audi s5 convertible 2012
'21': audi s5 coupe 2012
'22': audi s4 sedan 2012
'23': audi s4 sedan 2007
'24': audi tt rs coupe 2012
'25': bmw activehybrid 5 sedan 2012
'26': bmw 1 series convertible 2012
'27': bmw 1 series coupe 2012
'28': bmw 3 series sedan 2012
'29': bmw 3 series wagon 2012
'30': bmw 6 series convertible 2007
'31': bmw x5 suv 2007
'32': bmw x6 suv 2012
'33': bmw m3 coupe 2012
'34': bmw m5 sedan 2010
'35': bmw m6 convertible 2010
'36': bmw x3 suv 2012
'37': bmw z4 convertible 2012
'38': bentley continental supersports conv. convertible 2012
'39': bentley arnage sedan 2009
'40': bentley mulsanne sedan 2011
'41': bentley continental gt coupe 2012
'42': bentley continental gt coupe 2007
'43': bentley continental flying spur sedan 2007
'44': bugatti veyron 16.4 convertible 2009
'45': bugatti veyron 16.4 coupe 2009
'46': buick regal gs 2012
'47': buick rainier suv 2007
'48': buick verano sedan 2012
'49': buick enclave suv 2012
'50': cadillac cts-v sedan 2012
'51': cadillac srx suv 2012
'52': cadillac escalade ext crew cab 2007
'53': chevrolet silverado 1500 hybrid crew cab 2012
'54': chevrolet corvette convertible 2012
'55': chevrolet corvette zr1 2012
'56': chevrolet corvette ron fellows edition z06 2007
'57': chevrolet traverse suv 2012
'58': chevrolet camaro convertible 2012
'59': chevrolet hhr ss 2010
'60': chevrolet impala sedan 2007
'61': chevrolet tahoe hybrid suv 2012
'62': chevrolet sonic sedan 2012
'63': chevrolet express cargo van 2007
'64': chevrolet avalanche crew cab 2012
'65': chevrolet cobalt ss 2010
'66': chevrolet malibu hybrid sedan 2010
'67': chevrolet trailblazer ss 2009
'68': chevrolet silverado 2500hd regular cab 2012
'69': chevrolet silverado 1500 classic extended cab 2007
'70': chevrolet express van 2007
'71': chevrolet monte carlo coupe 2007
'72': chevrolet malibu sedan 2007
'73': chevrolet silverado 1500 extended cab 2012
'74': chevrolet silverado 1500 regular cab 2012
'75': chrysler aspen suv 2009
'76': chrysler sebring convertible 2010
'77': chrysler town and country minivan 2012
'78': chrysler 300 srt-8 2010
'79': chrysler crossfire convertible 2008
'80': chrysler pt cruiser convertible 2008
'81': daewoo nubira wagon 2002
'82': dodge caliber wagon 2012
'83': dodge caliber wagon 2007
'84': dodge caravan minivan 1997
'85': dodge ram pickup 3500 crew cab 2010
'86': dodge ram pickup 3500 quad cab 2009
'87': dodge sprinter cargo van 2009
'88': dodge journey suv 2012
'89': dodge dakota crew cab 2010
'90': dodge dakota club cab 2007
'91': dodge magnum wagon 2008
'92': dodge challenger srt8 2011
'93': dodge durango suv 2012
'94': dodge durango suv 2007
'95': dodge charger sedan 2012
'96': dodge charger srt-8 2009
'97': eagle talon hatchback 1998
'98': fiat 500 abarth 2012
'99': fiat 500 convertible 2012
'100': ferrari ff coupe 2012
'101': ferrari california convertible 2012
'102': ferrari 458 italia convertible 2012
'103': ferrari 458 italia coupe 2012
'104': fisker karma sedan 2012
'105': ford f-450 super duty crew cab 2012
'106': ford mustang convertible 2007
'107': ford freestar minivan 2007
'108': ford expedition el suv 2009
'109': ford edge suv 2012
'110': ford ranger supercab 2011
'111': ford gt coupe 2006
'112': ford f-150 regular cab 2012
'113': ford f-150 regular cab 2007
'114': ford focus sedan 2007
'115': ford e-series wagon van 2012
'116': ford fiesta sedan 2012
'117': gmc terrain suv 2012
'118': gmc savana van 2012
'119': gmc yukon hybrid suv 2012
'120': gmc acadia suv 2012
'121': gmc canyon extended cab 2012
'122': geo metro convertible 1993
'123': hummer h3t crew cab 2010
'124': hummer h2 sut crew cab 2009
'125': honda odyssey minivan 2012
'126': honda odyssey minivan 2007
'127': honda accord coupe 2012
'128': honda accord sedan 2012
'129': hyundai veloster hatchback 2012
'130': hyundai santa fe suv 2012
'131': hyundai tucson suv 2012
'132': hyundai veracruz suv 2012
'133': hyundai sonata hybrid sedan 2012
'134': hyundai elantra sedan 2007
'135': hyundai accent sedan 2012
'136': hyundai genesis sedan 2012
'137': hyundai sonata sedan 2012
'138': hyundai elantra touring hatchback 2012
'139': hyundai azera sedan 2012
'140': infiniti g coupe ipl 2012
'141': infiniti qx56 suv 2011
'142': isuzu ascender suv 2008
'143': jaguar xk xkr 2012
'144': jeep patriot suv 2012
'145': jeep wrangler suv 2012
'146': jeep liberty suv 2012
'147': jeep grand cherokee suv 2012
'148': jeep compass suv 2012
'149': lamborghini reventon coupe 2008
'150': lamborghini aventador coupe 2012
'151': lamborghini gallardo lp 570-4 superleggera 2012
'152': lamborghini diablo coupe 2001
'153': land rover range rover suv 2012
'154': land rover lr2 suv 2012
'155': lincoln town car sedan 2011
'156': mini cooper roadster convertible 2012
'157': maybach landaulet convertible 2012
'158': mazda tribute suv 2011
'159': mclaren mp4-12c coupe 2012
'160': mercedes-benz 300-class convertible 1993
'161': mercedes-benz c-class sedan 2012
'162': mercedes-benz sl-class coupe 2009
'163': mercedes-benz e-class sedan 2012
'164': mercedes-benz s-class sedan 2012
'165': mercedes-benz sprinter van 2012
'166': mitsubishi lancer sedan 2012
'167': nissan leaf hatchback 2012
'168': nissan nv passenger van 2012
'169': nissan juke hatchback 2012
'170': nissan 240sx coupe 1998
'171': plymouth neon coupe 1999
'172': porsche panamera sedan 2012
'173': ram c/v cargo van minivan 2012
'174': rolls-royce phantom drophead coupe convertible 2012
'175': rolls-royce ghost sedan 2012
'176': rolls-royce phantom sedan 2012
'177': scion xd hatchback 2012
'178': spyker c8 convertible 2009
'179': spyker c8 coupe 2009
'180': suzuki aerio sedan 2007
'181': suzuki kizashi sedan 2012
'182': suzuki sx4 hatchback 2012
'183': suzuki sx4 sedan 2012
'184': tesla model s sedan 2012
'185': toyota sequoia suv 2012
'186': toyota camry sedan 2012
'187': toyota corolla sedan 2012
'188': toyota 4runner suv 2012
'189': volkswagen golf hatchback 2012
'190': volkswagen golf hatchback 1991
'191': volkswagen beetle hatchback 2012
'192': volvo c30 hatchback 2012
'193': volvo 240 sedan 1993
'194': volvo xc90 suv 2007
'195': smart fortwo convertible 2012
- name: id
dtype: int64
- name: clip_tags_ViT_L_14
sequence: string
- name: LLM_Description_opt175b_downstream_tasks_ViT_L_14
sequence: string
- name: LLM_Description_gpt3_downstream_tasks_ViT_L_14
sequence: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
sequence: string
- name: blip_caption_beam_5
dtype: string
- name: Attributes_ViT_L_14_text_davinci_003_full
sequence: string
- name: Attributes_ViT_L_14_text_davinci_003_stanfordcars
sequence: string
- name: clip_tags_ViT_L_14_with_openai_classes
sequence: string
- name: clip_tags_ViT_L_14_wo_openai_classes
sequence: string
- name: clip_tags_ViT_L_14_simple_specific
dtype: string
- name: clip_tags_ViT_L_14_ensemble_specific
dtype: string
- name: clip_tags_ViT_B_16_simple_specific
dtype: string
- name: clip_tags_ViT_B_16_ensemble_specific
dtype: string
- name: clip_tags_ViT_B_32_simple_specific
dtype: string
- name: clip_tags_ViT_B_32_ensemble_specific
dtype: string
- name: Attributes_ViT_B_16_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_simple_specific
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B_ensemble_specific
dtype: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
splits:
- name: test
num_bytes: 1016320238.0
num_examples: 8041
download_size: 989991348
dataset_size: 1016320238.0
---
# Dataset Card for "StanfordCars_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-39000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1053437
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_cola_zero_plural | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 14427
num_examples: 194
- name: test
num_bytes: 14640
num_examples: 192
- name: train
num_bytes: 114261
num_examples: 1495
download_size: 73109
dataset_size: 143328
---
# Dataset Card for "MULTI_VALUE_cola_zero_plural"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_160m_bo2_100_kl_0.1_prm_70m_thr_0.3_seed_3 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: index
dtype: int64
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43586042
num_examples: 18929
- name: epoch_1
num_bytes: 44099890
num_examples: 18929
- name: epoch_2
num_bytes: 44209292
num_examples: 18929
- name: epoch_3
num_bytes: 44251605
num_examples: 18929
- name: epoch_4
num_bytes: 44274741
num_examples: 18929
- name: epoch_5
num_bytes: 44289273
num_examples: 18929
- name: epoch_6
num_bytes: 44296482
num_examples: 18929
- name: epoch_7
num_bytes: 44303398
num_examples: 18929
- name: epoch_8
num_bytes: 44306950
num_examples: 18929
- name: epoch_9
num_bytes: 44308769
num_examples: 18929
- name: epoch_10
num_bytes: 44311035
num_examples: 18929
- name: epoch_11
num_bytes: 44310924
num_examples: 18929
- name: epoch_12
num_bytes: 44312195
num_examples: 18929
- name: epoch_13
num_bytes: 44313444
num_examples: 18929
- name: epoch_14
num_bytes: 44313855
num_examples: 18929
- name: epoch_15
num_bytes: 44313210
num_examples: 18929
- name: epoch_16
num_bytes: 44314927
num_examples: 18929
- name: epoch_17
num_bytes: 44315070
num_examples: 18929
- name: epoch_18
num_bytes: 44315919
num_examples: 18929
- name: epoch_19
num_bytes: 44315408
num_examples: 18929
- name: epoch_20
num_bytes: 44315491
num_examples: 18929
- name: epoch_21
num_bytes: 44316046
num_examples: 18929
- name: epoch_22
num_bytes: 44315262
num_examples: 18929
- name: epoch_23
num_bytes: 44315781
num_examples: 18929
- name: epoch_24
num_bytes: 44316326
num_examples: 18929
- name: epoch_25
num_bytes: 44315765
num_examples: 18929
- name: epoch_26
num_bytes: 44316179
num_examples: 18929
- name: epoch_27
num_bytes: 44316692
num_examples: 18929
- name: epoch_28
num_bytes: 44316878
num_examples: 18929
- name: epoch_29
num_bytes: 44317006
num_examples: 18929
download_size: 699305585
dataset_size: 1328223855
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
oreva/squad_30_percent_pruned_by_ppl_gpt2-medium | ---
configs:
- config_name: default
data_files:
- split: top_ppl
path: data/top_ppl-*
- split: bottom_ppl
path: data/bottom_ppl-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
- name: prompt
dtype: string
- name: ppl_gpt2-medium
dtype: float64
splits:
- name: top_ppl
num_bytes: 42784619
num_examples: 23126
- name: bottom_ppl
num_bytes: 38869155
num_examples: 23126
download_size: 49882177
dataset_size: 81653774
---
# Dataset Card for "squad_30_percent_pruned_by_ppl_gpt2-medium"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ywan111/macbook-dataset | ---
license: apache-2.0
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 82755.0
num_examples: 1
download_size: 83469
dataset_size: 82755.0
---
|
bodonodon/colabunny | ---
license: afl-3.0
---
|
ytzi/the-stack-dedup-python-scored | ---
dataset_info:
config_name: python
features:
- name: hexsha
dtype: string
- name: size
dtype: int64
- name: ext
dtype: string
- name: lang
dtype: string
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_repo_head_hexsha
dtype: string
- name: max_stars_repo_licenses
sequence: string
- name: max_stars_count
dtype: int64
- name: max_stars_repo_stars_event_min_datetime
dtype: string
- name: max_stars_repo_stars_event_max_datetime
dtype: string
- name: max_issues_repo_path
dtype: string
- name: max_issues_repo_name
dtype: string
- name: max_issues_repo_head_hexsha
dtype: string
- name: max_issues_repo_licenses
sequence: string
- name: max_issues_count
dtype: int64
- name: max_issues_repo_issues_event_min_datetime
dtype: string
- name: max_issues_repo_issues_event_max_datetime
dtype: string
- name: max_forks_repo_path
dtype: string
- name: max_forks_repo_name
dtype: string
- name: max_forks_repo_head_hexsha
dtype: string
- name: max_forks_repo_licenses
sequence: string
- name: max_forks_count
dtype: int64
- name: max_forks_repo_forks_event_min_datetime
dtype: string
- name: max_forks_repo_forks_event_max_datetime
dtype: string
- name: content
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphanum_fraction
dtype: float64
- name: count_classes
dtype: int64
- name: score_classes
dtype: float64
- name: count_generators
dtype: int64
- name: score_generators
dtype: float64
- name: count_decorators
dtype: int64
- name: score_decorators
dtype: float64
- name: count_async_functions
dtype: int64
- name: score_async_functions
dtype: float64
- name: count_documentation
dtype: int64
- name: score_documentation
dtype: float64
splits:
- name: train
num_bytes: 72919338116
num_examples: 12962249
download_size: 28959409073
dataset_size: 72919338116
configs:
- config_name: python
data_files:
- split: train
path: python/train-*
---
|
ASIDS/alpaca-cleaned-ru |
---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: iteration
dtype: uint32
splits:
- name: train
num_bytes: 74829755.0
num_examples: 51760
download_size: 36596664
dataset_size: 74829755.0
license: cc-by-4.0
language:
- ru
multilinguality:
- monolingual
tags:
- instruction-finetuning
pretty_name: alpaca-cleaned-ru
task_categories:
- text-generation
size_categories:
- 10K<n<100K
source_datasets:
- yahma/alpaca-cleaned
language_creators:
- translated
---
# alpaca-cleaned-ru
Converted for AutoTrain from [d0rj/alpaca-cleaned-ru](https://huggingface.co/datasets/d0rj/alpaca-cleaned-ru).
Translated version of [yahma/alpaca-cleaned](https://huggingface.co/datasets/yahma/alpaca-cleaned) into Russian.
## Dataset Description
- **Repository:** https://github.com/gururise/AlpacaDataCleaned
- **Repository:** https://huggingface.co/datasets/d0rj/alpaca-cleaned-ru |
ShankarSaumil/ArakooAI_Task_Flan-v2 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 351820999.11881727
num_examples: 212520
- name: validation
num_bytes: 476643.24302443425
num_examples: 257
download_size: 847814029
dataset_size: 352297642.3618417
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Saulons3/cocina_final | ---
license: apache-2.0
dataset_info:
features:
- name: column0
dtype: string
- name: column1
dtype: string
splits:
- name: train
num_bytes: 40574
num_examples: 121
download_size: 21456
dataset_size: 40574
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Eduardovco/edu2 | ---
license: openrail
---
|
tianleliphoebe/DreamEditBench | ---
license: cc-by-4.0
task_categories:
- image-to-image
- text-to-image
language:
- en
size_categories:
- n<1K
---
## DreamEditBench for the Subject Replacement and Subject Addition tasks
## Dataset Description
- **Homepage:** https://dreameditbenchteam.github.io
- **Repository:** https://github.com/DreamEditBenchTeam/DreamEdit
<!-- **Paper:** https://arxiv.org/abs/2306.12624 -->
The goal of subject replacement is to replace a subject in a source image with a customized subject. In contrast, the aim of subject addition is to add a customized subject at a desired position in the source image. To standardize the evaluation of the two proposed tasks, we curate a new benchmark, DreamEditBench, consisting of 22 subjects aligned with DreamBooth, with 20 images per subject. For the subject replacement task, we collect 10 images for each type, featuring same-typed source subjects in diverse environments. The images are retrieved from the internet with the search query “a photo of [Class name]”, and the source subject should be the main subject in the image, dominating a major part of the photo. For the subject addition task, we collect 10 reasonable backgrounds for each type of subject, and manually designate with a bounding box the specific location where the target subject should be placed in the background. To collect the specific backgrounds for each subject, we first brainstorm and list the possible common environments of the subjects, then search the listed keywords on the internet to retrieve and pick the backgrounds.
## Data Structure
There are 22 subject folders in each task folder. Each subject folder contains 10 source images. For the Subject Addition task, there is an additional bbox.json file recording the manually labeled bounding box for each background.
The replacement_subset.csv and addition_subset.csv files record the easy/hard subset division for each task, respectively.
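As a minimal sketch, the per-subject bounding-box annotations could be read like this. Note that the internal key/value layout of bbox.json is an assumption here (background filename mapped to box coordinates), not documented above; check the file after download.

```python
import json
from pathlib import Path


def load_bboxes(subject_dir):
    """Load the manually labeled bounding box for each background image.

    The layout assumed here is {background filename: box coordinates};
    adjust once you have inspected the actual bbox.json.
    """
    with open(Path(subject_dir) / "bbox.json", "r", encoding="utf-8") as f:
        return json.load(f)
```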
## Citation Information
If you find this dataset useful, please consider citing our paper:
```
@misc{li2023dreamedit,
title={DreamEdit: Subject-driven Image Editing},
author={Tianle Li and Max Ku and Cong Wei and Wenhu Chen},
year={2023},
eprint={2306.12624},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
``` |
Veector2312/osmar1 | ---
license: openrail
---
|
Seanxh/twitter_dataset_1713132324 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 21488
num_examples: 53
download_size: 13718
dataset_size: 21488
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DZN111/test | ---
license: openrail
---
|
distilled-from-one-sec-cv12/chunk_217 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1133607480
num_examples: 220890
download_size: 1159089868
dataset_size: 1133607480
---
# Dataset Card for "chunk_217"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
leemeng/ShareGPT90K_ja_1392 | ---
license: cc0-1.0
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 24698698
num_examples: 1392
download_size: 8804954
dataset_size: 24698698
---
|
davanstrien/cosmopedia_chat | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: text_token_length
dtype: int64
- name: text
dtype: string
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
- name: title
dtype: string
- name: generated_text
dtype: string
splits:
- name: train
num_bytes: 5362254
num_examples: 1188
download_size: 1902915
dataset_size: 5362254
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- text-generation
language:
- en
tags:
- synthetic
pretty_name: Cosmopedia Chat
size_categories:
- 1K<n<10K
---
# Dataset Card for Cosmopedia Chat
<p align="center">
<img src="https://cdn-uploads.huggingface.co/production/uploads/60107b385ac3e86b3ea4fc34/6mMBW7gBurVT6kYpjX9L8.png" alt="Your Image" width="500">
</p>
## Dataset Details
### Dataset Description
Docs are WIP!
Rough steps to produce this data.
- Start with [HuggingFaceTB/cosmopedia](https://huggingface.co/datasets/HuggingFaceTB/cosmopedia) dataset
- Select `khanacademy` config
- filter by text length
- remove some examples with certain text i.e. responses starting with "sure"
- Extract a title from the original prompt in the dataset
- Pass title + text to [NousResearch/Genstruct-7B](https://huggingface.co/NousResearch/Genstruct-7B) to create user/chat pairs from this title + text context
- Profit??
TODO:
- More curation of what data is included to start. Some of the Cosmopedia data is not very "stand alone"
- Try and remove topics that are unlikely to be useful for chat data
- Remove bad generations
- Parse generations into users/assistant chat format.
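The filtering steps above can be sketched roughly as follows. The length bounds and the exact "sure" check are assumptions for illustration, not the values actually used to build this dataset:

```python
# A minimal sketch of the filtering step, with hypothetical length bounds;
# the thresholds actually used for Cosmopedia Chat are not documented here.
MIN_CHARS, MAX_CHARS = 500, 20000


def keep(example):
    """Return True if a Cosmopedia row should be kept."""
    text = example["text"]
    if not (MIN_CHARS <= len(text) <= MAX_CHARS):
        return False
    # drop generations that open with "sure" (e.g. "Sure! Here's ...")
    return not text.lstrip().lower().startswith("sure")


# With the datasets library this would be applied as:
#   ds = load_dataset("HuggingFaceTB/cosmopedia", "khanacademy", split="train")
#   ds = ds.filter(keep)
```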
|
justinian336/salvadoran-news-edh | ---
dataset_info:
features:
- name: image_src
dtype: string
- name: title
dtype: string
- name: content
dtype: string
- name: category
dtype:
class_label:
names:
'0': opinion
'1': noticias
'2': videos
'3': entretenimiento
'4': vida
'5': deportes/zona-mundialista
'6': opinion/caricaturas
'7': fotogalerias
'8': null
'9': deportes
- name: link
dtype: string
splits:
- name: train
num_bytes: 196407515
num_examples: 55345
download_size: 111585522
dataset_size: 196407515
---
# Dataset Card for "salvadoran-news-edh"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/mmarco_v2_es_dev | ---
pretty_name: '`mmarco/v2/es/dev`'
viewer: false
source_datasets: ['irds/mmarco_v2_es']
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/v2/es/dev`
The `mmarco/v2/es/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/v2/es/dev).
# Data
This dataset provides:
- `queries` (i.e., topics); count=101,093
- `qrels`: (relevance assessments); count=59,273
- For `docs`, use [`irds/mmarco_v2_es`](https://huggingface.co/datasets/irds/mmarco_v2_es)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mmarco_v2_es_dev', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mmarco_v2_es_dev', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
open-llm-leaderboard/details_TW3Partners__testmerge-7b | ---
pretty_name: Evaluation run of TW3Partners/testmerge-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TW3Partners/testmerge-7b](https://huggingface.co/TW3Partners/testmerge-7b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TW3Partners__testmerge-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T11:52:55.740432](https://huggingface.co/datasets/open-llm-leaderboard/details_TW3Partners__testmerge-7b/blob/main/results_2024-04-15T11-52-55.740432.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6471497788929994,\n\
\ \"acc_stderr\": 0.03215584016671172,\n \"acc_norm\": 0.6466881410376628,\n\
\ \"acc_norm_stderr\": 0.032826608517522254,\n \"mc1\": 0.6217870257037944,\n\
\ \"mc1_stderr\": 0.016976335907546863,\n \"mc2\": 0.7466999166295964,\n\
\ \"mc2_stderr\": 0.014498007351803632\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7226962457337884,\n \"acc_stderr\": 0.013082095839059374,\n\
\ \"acc_norm\": 0.7414675767918089,\n \"acc_norm_stderr\": 0.012794553754288692\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7349133638717387,\n\
\ \"acc_stderr\": 0.0044047727357659884,\n \"acc_norm\": 0.8937462656841266,\n\
\ \"acc_norm_stderr\": 0.0030753230104084216\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n\
\ \"acc_stderr\": 0.022891687984554952,\n \"acc_norm\": 0.7967741935483871,\n\
\ \"acc_norm_stderr\": 0.022891687984554952\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139402,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139402\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.038498560987940904,\n \"\
acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940904\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n\
\ \"acc_stderr\": 0.01656897123354861,\n \"acc_norm\": 0.4324022346368715,\n\
\ \"acc_norm_stderr\": 0.01656897123354861\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667874,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n\
\ \"acc_stderr\": 0.012734923579532069,\n \"acc_norm\": 0.46284224250325945,\n\
\ \"acc_norm_stderr\": 0.012734923579532069\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6217870257037944,\n\
\ \"mc1_stderr\": 0.016976335907546863,\n \"mc2\": 0.7466999166295964,\n\
\ \"mc2_stderr\": 0.014498007351803632\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8531965272296764,\n \"acc_stderr\": 0.009946627440250676\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6459438968915845,\n \
\ \"acc_stderr\": 0.01317272838522258\n }\n}\n```"
repo_url: https://huggingface.co/TW3Partners/testmerge-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|arc:challenge|25_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|gsm8k|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hellaswag|10_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T11-52-55.740432.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T11-52-55.740432.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- '**/details_harness|winogrande|5_2024-04-15T11-52-55.740432.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T11-52-55.740432.parquet'
- config_name: results
data_files:
- split: 2024_04_15T11_52_55.740432
path:
- results_2024-04-15T11-52-55.740432.parquet
- split: latest
path:
- results_2024-04-15T11-52-55.740432.parquet
---
# Dataset Card for Evaluation run of TW3Partners/testmerge-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TW3Partners/testmerge-7b](https://huggingface.co/TW3Partners/testmerge-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, with the split named after the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TW3Partners__testmerge-7b",
"harness_winogrande_5",
             split="latest")
```
## Latest results
These are the [latest results from run 2024-04-15T11:52:55.740432](https://huggingface.co/datasets/open-llm-leaderboard/details_TW3Partners__testmerge-7b/blob/main/results_2024-04-15T11-52-55.740432.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6471497788929994,
"acc_stderr": 0.03215584016671172,
"acc_norm": 0.6466881410376628,
"acc_norm_stderr": 0.032826608517522254,
"mc1": 0.6217870257037944,
"mc1_stderr": 0.016976335907546863,
"mc2": 0.7466999166295964,
"mc2_stderr": 0.014498007351803632
},
"harness|arc:challenge|25": {
"acc": 0.7226962457337884,
"acc_stderr": 0.013082095839059374,
"acc_norm": 0.7414675767918089,
"acc_norm_stderr": 0.012794553754288692
},
"harness|hellaswag|10": {
"acc": 0.7349133638717387,
"acc_stderr": 0.0044047727357659884,
"acc_norm": 0.8937462656841266,
"acc_norm_stderr": 0.0030753230104084216
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554952,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554952
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139402,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139402
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940904,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940904
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4324022346368715,
"acc_stderr": 0.01656897123354861,
"acc_norm": 0.4324022346368715,
"acc_norm_stderr": 0.01656897123354861
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667874,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532069,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532069
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6217870257037944,
"mc1_stderr": 0.016976335907546863,
"mc2": 0.7466999166295964,
"mc2_stderr": 0.014498007351803632
},
"harness|winogrande|5": {
"acc": 0.8531965272296764,
"acc_stderr": 0.009946627440250676
},
"harness|gsm8k|5": {
"acc": 0.6459438968915845,
"acc_stderr": 0.01317272838522258
}
}
```
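As an illustrative sketch (not part of the official leaderboard tooling), the per-task accuracies in the JSON above can be combined into a macro-average over the MMLU (hendrycksTest) tasks. The dict below copies just three entries from the results for brevity; the full results file contains all 57 hendrycksTest tasks, so the number printed here differs from the leaderboard's aggregate.

```python
# Minimal sketch: macro-average the "acc" field over hendrycksTest tasks.
# `results` copies three entries from the JSON above for illustration only.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6222222222222222},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6644736842105263},
}

# Select the MMLU tasks by their "hendrycksTest" prefix and average them.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(macro_avg, 4))  # → 0.5389
```

The same pattern extends to the full results JSON: load it, filter keys by task prefix, and average whichever metric ("acc", "acc_norm", etc.) you need.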
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
MrM0dZ/Samples | ---
license: openrail
---
|
Nexdata/48_Categories_307776_Images_of_Scene_Classification_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
48 categories, 307,776 images of scene classification data. The data diversity covers multiple scenes and different photographic angles. The dataset can be used for tasks such as scene classification.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1143?source=Huggingface
## Data size
48 categories, including 15 sub-categories, a total of 307,776 images
## Data diversity
multiple scenes, different photographic angles
## Collecting time
day, night
## Data format
.jpg, .png, .jpeg
## Accuracy
The image classification accuracy exceeds 95%.
# Licensing Information
Commercial License
|
adityarra07/train_24000 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 3108325777.5861845
num_examples: 23322
- name: test
num_bytes: 26655739.452758636
num_examples: 200
download_size: 3090623993
dataset_size: 3134981517.0389433
---
# Dataset Card for "train_24000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kevinautomation/llme2_sft_dataset_rlaif_2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 6159
num_examples: 5
download_size: 11485
dataset_size: 6159
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dream | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- multiple-choice-qa
paperswithcode_id: dream
pretty_name: DREAM
dataset_info:
features:
- name: id
dtype: int32
- name: dialogue_id
dtype: string
- name: dialogue
sequence: string
- name: question
dtype: string
- name: choice
sequence: string
- name: answer
dtype: string
config_name: plain_text
splits:
- name: train
num_bytes: 4775235
num_examples: 6116
- name: validation
num_bytes: 1539272
num_examples: 2040
- name: test
num_bytes: 1556379
num_examples: 2041
download_size: 5558190
dataset_size: 7870886
---
# Dataset Card for DREAM
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Add homepage URL here if available (unless it's a GitHub repository)]()
- **Repository:** [If the dataset is hosted on github or has a github homepage, add URL here]()
- **Paper:** [If the dataset was introduced by a paper or there was a paper written describing the dataset, add URL here (landing page for Arxiv paper preferred)]()
- **Leaderboard:** [If the dataset supports an active leaderboard, add link here]()
- **Point of Contact:** [If known, name and email of at least one person the reader can contact for questions about the dataset.]()
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@patil-suraj](https://github.com/patil-suraj) for adding this dataset. |
tolgadev/turkish_73k_instruct_extended | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 46016105
num_examples: 73124
download_size: 23400866
dataset_size: 46016105
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- text-generation
- text2text-generation
language:
- tr
size_categories:
- 10K<n<100K
---
## turkish_73k_instruct_extended
This repository contains a dataset compiled from Turkish instruction-based sources, structured in the Llama Instruct format. Feel free to use this dataset for training and fine-tuning Turkish large language models (LLMs). ⭐
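As a minimal sketch of what "Llama Instruct format" means for the single `text` column, the snippet below uses the common `[INST] ... [/INST]` template. This template is an assumption for illustration; verify it against actual rows of the dataset before training:

```python
def to_llama_instruct(instruction: str, response: str) -> str:
    # Hypothetical formatter using the common Llama instruct template;
    # the exact template in this dataset should be checked against real rows.
    return f"<s>[INST] {instruction} [/INST] {response}</s>"

row = to_llama_instruct("Türkiye'nin başkenti neresidir?", "Ankara'dır.")
print(row)
```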
## Merged Datasets
| DatasetName | Link | Licence | numRows |
| ---- | ---- | ---- | ---- |
| [merve/turkish_instructions](https://huggingface.co/datasets/merve/turkish_instructions) | https://huggingface.co/datasets/merve/turkish_instructions | apache-2.0 | 51.6k |
| [tolgadev/ruyatabirleri_instruct](https://huggingface.co/datasets/tolgadev/ruyatabirleri_instruct) | https://huggingface.co/datasets/tolgadev/ruyatabirleri_instruct | apache-2.0 | 8.9k |
| [mertbozkurt/llama2-TR-recipe](https://huggingface.co/datasets/mertbozkurt/llama2-TR-recipe) | https://huggingface.co/datasets/mertbozkurt/llama2-TR-recipe | mit | 10.5k |
| [CausalLM/GPT-4-Self-Instruct-Turkish](https://huggingface.co/datasets/CausalLM/GPT-4-Self-Instruct-Turkish) | https://huggingface.co/datasets/CausalLM/GPT-4-Self-Instruct-Turkish | cc-by-4.0 | 3.08k |
| [emre/stanford-alpaca-cleaned-turkish-translated](https://huggingface.co/datasets/emre/stanford-alpaca-cleaned-turkish-translated) | https://huggingface.co/datasets/emre/stanford-alpaca-cleaned-turkish-translated | afl-3.0 | -|
### Citation
Please cite all the repositories listed above when using this dataset or the code in this repo.
```
@misc{turkish_73k_instruct_extended,
author = {Kurtuluş, Tolga},
title = {turkish_73k_instruct_extended},
year = {2024},
publisher = {HuggingFace.co},
journal = {HuggingFace dataset repository},
howpublished = {\url{https://huggingface.co/datasets/tolgadev/turkish_73k_instruct_extended}},
}
``` |
awettig/Pile-ArXiv-0.5B-8K-opt | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 6500715780
num_examples: 61035
- name: test
num_bytes: 64969880
num_examples: 610
download_size: 1583362590
dataset_size: 6565685660
---
# Dataset Card for "Pile-ArXiv-0.5B-8K-opt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Syed-Hasan-8503/tiny-codes-pretraining | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3580824575
num_examples: 1632309
download_size: 1680133396
dataset_size: 3580824575
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Kamyar-zeinalipour/AR_CW | ---
dataset_info:
features:
- name: clue
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 2063175
num_examples: 57706
download_size: 1126121
dataset_size: 2063175
---
# Dataset Card for "AR_CW"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
giuseppemartino/i-SAID_custom_or_1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 6362576122.0
num_examples: 840
- name: validation
num_bytes: 905977299.0
num_examples: 99
download_size: 7262651438
dataset_size: 7268553421.0
---
# Dataset Card for "i-SAID_custom_or_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
saibo/bookcorpus_compact_256_test_meta | ---
dataset_info:
features:
- name: text
dtype: string
- name: concept_with_offset
dtype: string
- name: cid_arrangement
sequence: int32
- name: schema_lengths
sequence: int64
- name: topic_entity_mask
sequence: int64
- name: text_lengths
sequence: int64
splits:
- name: train
num_bytes: 214680900
num_examples: 6160
download_size: 47705450
dataset_size: 214680900
---
# Dataset Card for "bookcorpus_compact_256_test_meta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pks3kor/medical_qa | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-staging-eval-project-d60b4e7e-7574888 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xtreme
eval_info:
task: entity_extraction
model: OneFly/xlm-roberta-base-finetuned-panx-de
metrics: []
dataset_name: xtreme
dataset_config: PAN-X.de
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: OneFly/xlm-roberta-base-finetuned-panx-de
* Dataset: xtreme
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
jhhon80/jhhon | ---
license: openrail
---
|
dura-garage/nep-spell-50k | ---
license: mit
---
|
sheik21/lucas-vocals | ---
license: openrail
---
|