| datasetId | card |
|---|---|
knowrohit07/GPTscience_maths_csml | ---
license: other
---
|
visual-layer/vl-food101 | ---
license: other
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': apple_pie
'1': baby_back_ribs
'2': baklava
'3': beef_carpaccio
'4': beef_tartare
'5': beet_salad
'6': beignets
'7': bibimbap
'8': bread_pudding
'9': breakfast_burrito
'10': bruschetta
'11': caesar_salad
'12': cannoli
'13': caprese_salad
'14': carrot_cake
'15': ceviche
'16': cheese_plate
'17': cheesecake
'18': chicken_curry
'19': chicken_quesadilla
'20': chicken_wings
'21': chocolate_cake
'22': chocolate_mousse
'23': churros
'24': clam_chowder
'25': club_sandwich
'26': crab_cakes
'27': creme_brulee
'28': croque_madame
'29': cup_cakes
'30': deviled_eggs
'31': donuts
'32': dumplings
'33': edamame
'34': eggs_benedict
'35': escargots
'36': falafel
'37': filet_mignon
'38': fish_and_chips
'39': foie_gras
'40': french_fries
'41': french_onion_soup
'42': french_toast
'43': fried_calamari
'44': fried_rice
'45': frozen_yogurt
'46': garlic_bread
'47': gnocchi
'48': greek_salad
'49': grilled_cheese_sandwich
'50': grilled_salmon
'51': guacamole
'52': gyoza
'53': hamburger
'54': hot_and_sour_soup
'55': hot_dog
'56': huevos_rancheros
'57': hummus
'58': ice_cream
'59': lasagna
'60': lobster_bisque
'61': lobster_roll_sandwich
'62': macaroni_and_cheese
'63': macarons
'64': miso_soup
'65': mussels
'66': nachos
'67': omelette
'68': onion_rings
'69': oysters
'70': pad_thai
'71': paella
'72': pancakes
'73': panna_cotta
'74': peking_duck
'75': pho
'76': pizza
'77': pork_chop
'78': poutine
'79': prime_rib
'80': pulled_pork_sandwich
'81': ramen
'82': ravioli
'83': red_velvet_cake
'84': risotto
'85': samosa
'86': sashimi
'87': scallops
'88': seaweed_salad
'89': shrimp_and_grits
'90': spaghetti_bolognese
'91': spaghetti_carbonara
'92': spring_rolls
'93': steak
'94': strawberry_shortcake
'95': sushi
'96': tacos
'97': takoyaki
'98': tiramisu
'99': tuna_tartare
'100': waffles
splits:
- name: train
num_bytes: 3823897663.06
num_examples: 75284
- name: test
num_bytes: 1271132919.55
num_examples: 25150
download_size: 5037088300
dataset_size: 5095030582.61
---
# Description
The vl-food101 dataset by [Visual Layer](https://visual-layer.com) is a sanitized version of the original [Food101](https://data.vision.ee.ethz.ch/cvl/datasets_extra/food-101/) dataset.
The original dataset consists of 101 food categories with about 101,000 images in total.
The following issues were found in the original dataset and removed from this version:
<table>
<thead>
<tr>
<th style="text-align: left;">Category</th>
<th style="text-align: left;">Percentage</th>
<th style="text-align: left;">Count</th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align: left;">Duplicates</td>
<td style="text-align: left;"><div>0.23%</div></td>
<td style="text-align: left;"><div>235</div></td>
</tr>
<tr>
<td style="text-align: left;">Outliers</td>
<td style="text-align: left;"><div>0.08%</div></td>
<td style="text-align: left;"><div>77</div></td>
</tr>
<tr>
<td style="text-align: left;">Blur</td>
<td style="text-align: left;"><div>0.18%</div></td>
<td style="text-align: left;"><div>185</div></td>
</tr>
<tr>
<td style="text-align: left;">Dark</td>
<td style="text-align: left;"><div>0.04%</div></td>
<td style="text-align: left;"><div>43</div></td>
</tr>
<tr>
<td style="text-align: left;">Leakage</td>
<td style="text-align: left;"><div>0.086%</div></td>
<td style="text-align: left;"><div>87</div></td>
</tr>
<tr>
<td style="text-align: left; font-weight: bold;">Total</td>
<td style="text-align: left; font-weight: bold;"><div>0.62%</div></td>
<td style="text-align: left; font-weight: bold;"><div>627</div></td>
</tr>
</tbody>
</table>
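As a quick sanity check, the Total row in the table above follows from the per-category counts, with percentages taken relative to the roughly 101,000 images in the original Food101 (an illustrative sketch, not part of the official card):

```python
# Per-category counts of flagged images, copied from the table above.
issues = {"Duplicates": 235, "Outliers": 77, "Blur": 185, "Dark": 43, "Leakage": 87}

total = sum(issues.values())
print(total)  # 627, matching the Total row

# The table's percentages are fractions of the ~101,000 original images,
# e.g. the Total row:
print(f"{total / 101_000:.2%}")  # 0.62%
```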
Learn more: [vl-food101 in the Visual Layer docs](https://docs.visual-layer.com/docs/available-datasets#vl-food101)
# About Visual-Layer
<div align="center">
<a href="https://www.visual-layer.com">
    <img src="https://github.com/visual-layer/visuallayer/blob/main/imgs/vl_horizontal_logo.png?raw=true" alt="Visual Layer Logo" width="400">
</a>
</div>
Visual Layer was founded by the authors of [XGBoost](https://github.com/dmlc/xgboost), [Apache TVM](https://github.com/apache/tvm) & [Turi Create](https://github.com/apple/turicreate) - [Danny Bickson](https://www.linkedin.com/in/dr-danny-bickson-835b32), [Carlos Guestrin](https://www.linkedin.com/in/carlos-guestrin-5352a869) and [Amir Alush](https://www.linkedin.com/in/amiralush).
Learn more about Visual Layer [here](https://visual-layer.com). |
Mike36Theone/Guyofact | ---
license: openrail
---
|
alvarobartt/logging | ---
dataset_info:
features:
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 351
num_examples: 10
download_size: 1138
dataset_size: 351
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jeanlapeno/taco | ---
license: openrail
---
|
rxck/ia | ---
license: openrail
---
|
open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.03 | ---
pretty_name: Evaluation run of perlthoughts/Chupacabra-7B-v2.03
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [perlthoughts/Chupacabra-7B-v2.03](https://huggingface.co/perlthoughts/Chupacabra-7B-v2.03)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.03\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-11T01:08:10.868540](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.03/blob/main/results_2023-12-11T01-08-10.868540.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6311240267277043,\n\
\ \"acc_stderr\": 0.032483219688012585,\n \"acc_norm\": 0.6342658628547596,\n\
\ \"acc_norm_stderr\": 0.033138776879109225,\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.016542412809494887,\n \"mc2\": 0.48533561495565075,\n\
\ \"mc2_stderr\": 0.015259691745766833\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5981228668941979,\n \"acc_stderr\": 0.014327268614578274,\n\
\ \"acc_norm\": 0.6382252559726962,\n \"acc_norm_stderr\": 0.014041957945038076\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6450906193985262,\n\
\ \"acc_stderr\": 0.004775079636567097,\n \"acc_norm\": 0.8473411670981876,\n\
\ \"acc_norm_stderr\": 0.0035892328893065146\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493857,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493857\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782658,\n \"\
acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782658\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"\
acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.024537591572830503,\n\
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.024537591572830503\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.040261414976346104,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.040261414976346104\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374294,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374294\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944863,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944863\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560417,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560417\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579828,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579828\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3452513966480447,\n\
\ \"acc_stderr\": 0.01590143260893036,\n \"acc_norm\": 0.3452513966480447,\n\
\ \"acc_norm_stderr\": 0.01590143260893036\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02492200116888633,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02492200116888633\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n\
\ \"acc_stderr\": 0.012738547371303957,\n \"acc_norm\": 0.46479791395045633,\n\
\ \"acc_norm_stderr\": 0.012738547371303957\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.02909720956841195,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.02909720956841195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6388888888888888,\n \"acc_stderr\": 0.01943177567703731,\n \
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.01943177567703731\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.030021056238440303,\n\
\ \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.030021056238440303\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.016542412809494887,\n \"mc2\": 0.48533561495565075,\n\
\ \"mc2_stderr\": 0.015259691745766833\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510432\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.510235026535254,\n \
\ \"acc_stderr\": 0.013769598923012388\n }\n}\n```"
repo_url: https://huggingface.co/perlthoughts/Chupacabra-7B-v2.03
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|arc:challenge|25_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|gsm8k|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hellaswag|10_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T01-08-10.868540.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-11T01-08-10.868540.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- '**/details_harness|winogrande|5_2023-12-11T01-08-10.868540.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-11T01-08-10.868540.parquet'
- config_name: results
data_files:
- split: 2023_12_11T01_08_10.868540
path:
- results_2023-12-11T01-08-10.868540.parquet
- split: latest
path:
- results_2023-12-11T01-08-10.868540.parquet
---
# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2.03
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/perlthoughts/Chupacabra-7B-v2.03
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [perlthoughts/Chupacabra-7B-v2.03](https://huggingface.co/perlthoughts/Chupacabra-7B-v2.03) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.03",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-11T01:08:10.868540](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.03/blob/main/results_2023-12-11T01-08-10.868540.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its "latest" split):
```python
{
"all": {
"acc": 0.6311240267277043,
"acc_stderr": 0.032483219688012585,
"acc_norm": 0.6342658628547596,
"acc_norm_stderr": 0.033138776879109225,
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494887,
"mc2": 0.48533561495565075,
"mc2_stderr": 0.015259691745766833
},
"harness|arc:challenge|25": {
"acc": 0.5981228668941979,
"acc_stderr": 0.014327268614578274,
"acc_norm": 0.6382252559726962,
"acc_norm_stderr": 0.014041957945038076
},
"harness|hellaswag|10": {
"acc": 0.6450906193985262,
"acc_stderr": 0.004775079636567097,
"acc_norm": 0.8473411670981876,
"acc_norm_stderr": 0.0035892328893065146
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782658,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782658
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.024537591572830503,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.024537591572830503
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.040261414976346104,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.040261414976346104
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374294,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374294
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944863,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944863
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560417,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560417
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579828,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579828
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3452513966480447,
"acc_stderr": 0.01590143260893036,
"acc_norm": 0.3452513966480447,
"acc_norm_stderr": 0.01590143260893036
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02492200116888633,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02492200116888633
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303957,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303957
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.02909720956841195,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.02909720956841195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.01943177567703731,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.01943177567703731
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.030021056238440303,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.030021056238440303
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494887,
"mc2": 0.48533561495565075,
"mc2_stderr": 0.015259691745766833
},
"harness|winogrande|5": {
"acc": 0.8089976322020521,
"acc_stderr": 0.011047808761510432
},
"harness|gsm8k|5": {
"acc": 0.510235026535254,
"acc_stderr": 0.013769598923012388
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
5CD-AI/Vietnamese-Multi-turn-Chat-Alpaca | ---
task_categories:
- question-answering
language:
- vi
- en
--- |
ovior/twitter_dataset_1713136480 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2373420
num_examples: 7367
download_size: 1334951
dataset_size: 2373420
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ovior/twitter_dataset_1713169290 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2450951
num_examples: 6977
download_size: 1423361
dataset_size: 2450951
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
patruff/chucklesEFT1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 763626
num_examples: 672
- name: test
num_bytes: 191010
num_examples: 168
download_size: 153560
dataset_size: 954636
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
huggingface/autotrain-data-lftu-4dli-af43 | Invalid username or password. |
autoevaluate/autoeval-eval-project-banking77-77f5d7e6-1267748583 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- banking77
eval_info:
task: multi_class_classification
model: nickprock/xlm-roberta-base-banking77-classification
metrics: []
dataset_name: banking77
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: nickprock/xlm-roberta-base-banking77-classification
* Dataset: banking77
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nickprock](https://huggingface.co/nickprock) for evaluating this model. |
singhsaurabh/mini-platypus | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CohereForAI/black-box-api-challenges | ---
license: apache-2.0
task_categories:
- text-classification
- text-generation
language:
- en
tags:
- toxicity
- text
- nlp
- fairness
pretty_name: On the challenges of using black-box APIs for toxicity evaluation in research
---
# Dataset Card
**Paper**: On the Challenges of Using Black-Box APIs for Toxicity Evaluation in Research
**Abstract**: Perception of toxicity evolves over time and often differs between geographies and cultural backgrounds. Similarly, black-box commercially available APIs for detecting toxicity, such as the Perspective API, are not static, but frequently retrained to address any unattended weaknesses and biases. We evaluate the implications of these changes on the reproducibility of findings that compare the relative merits of models and methods that aim to curb toxicity. Our findings suggest that research that relied on inherited automatic toxicity scores to compare models and techniques may have resulted in inaccurate findings. Rescoring all models from HELM, a widely respected living benchmark, for toxicity with the recent version of the API led to a different ranking of extensively used models. We suggest caution in applying apples-to-apples comparisons between studies and lay recommendations for a more structured approach to evaluating toxicity over time.
Published on the [Trustworthy and Reliable Large-Scale Machine Learning Models ICLR 2023 Workshop](https://rtml-iclr2023.github.io/cfp.html).
[[Code]](https://github.com/for-ai/black-box-api-challenges) [[OpenReview]](https://openreview.net/forum?id=bRDHL4J5vy) [[Extended Pre-print]]()
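The abstract's central point — that rescoring with a newer version of a toxicity API can reorder model rankings — can be illustrated with a toy comparison. The model names and scores below are hypothetical, not values from the paper:

```python
def rank(scores):
    """Order model names from least to most toxic (ascending mean toxicity)."""
    return sorted(scores, key=scores.get)

# Hypothetical mean toxicity scores before and after an API update
# (illustrative only; not data from the paper or from HELM).
old_scores = {"model_a": 0.12, "model_b": 0.15, "model_c": 0.20}
new_scores = {"model_a": 0.18, "model_b": 0.15, "model_c": 0.20}

print(rank(old_scores))  # ['model_a', 'model_b', 'model_c']
print(rank(new_scores))  # ['model_b', 'model_a', 'model_c']
```

A single model's rescored value is enough to flip the relative ordering, which is why the paper cautions against comparing toxicity numbers inherited from different scoring dates.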
## Dataset Description
This repo contains the data from the paper "On the challenges of using black-box APIs for toxicity evaluation in research".
In the folders you can find:
- **real-toxicity-prompts:** prompts from the RealToxicityPrompts dataset rescored with Perspective API in February 2023.
- **helm:** prompts and continuations from the HELM benchmark v0.2.2 rescored with Perspective API in April 2023. That folder also contains the original stats for each model as scraped from the website.
- **dexperts:** prompts and continuations from a few models from the DExperts paper. Rescored with Perspective API in February 2023.
- **uddia:** continuations from UDDIA models. Rescored with Perspective API in February 2023.
### RealToxicityPrompts
RealToxicityPrompts is a dataset of 100k sentence snippets from the web for researchers to further address the risk of neural toxic degeneration in models.
- **Homepage:** [Toxic Degeneration homepage](https://toxicdegeneration.allenai.org/)
- **Repository:** [Code repository](https://github.com/allenai/real-toxicity-prompts)
- **Paper:** [RealToxicityPrompts: Evaluating Neural Toxic Degeneration in Language Models](https://arxiv.org/abs/2009.11462)
### HELM
- **Homepage:** [HELM Benchmark](https://crfm.stanford.edu/helm/latest/)
- **Repository:** [Code repository](https://github.com/stanford-crfm/helm)
- **Paper:** [Holistic Evaluation of Language Models](https://arxiv.org/abs/2211.09110)
### DExperts
- **Repository:** [Code repository](https://github.com/alisawuffles/DExperts)
- **Paper:** [DExperts: Decoding-Time Controlled Text Generation with Experts and Anti-Experts](https://arxiv.org/abs/2105.03023)
### UDDIA
- **Paper:** [Unified Detoxifying and Debiasing in Language Generation via Inference-time Adaptive Optimization](https://arxiv.org/abs/2210.04492)
# Citation
```
@inproceedings{
pozzobon2023on,
title={On the Challenges of Using Black-Box {API}s for Toxicity Evaluation in Research},
author={Luiza Amador Pozzobon and Beyza Ermis and Patrick Lewis and Sara Hooker},
booktitle={ICLR 2023 Workshop on Trustworthy and Reliable Large-Scale Machine Learning Models },
year={2023},
url={https://openreview.net/forum?id=bRDHL4J5vy}
}
```
|
gokulraj/preondataset | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
- name: clientId
dtype: string
- name: time
dtype: float64
splits:
- name: train
num_bytes: 6836463.0
num_examples: 310
download_size: 6807860
dataset_size: 6836463.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-security_studies-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 274355
num_examples: 245
download_size: 152071
dataset_size: 274355
---
# Dataset Card for "mmlu-security_studies-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
polinaeterna/earn | ---
license: cc-by-sa-4.0
---
|
ali-alkhars/interviews | ---
language:
- en
tags:
- jobs
- interviews
- career
- interview
pretty_name: Interview Questions Dataset
size_categories:
- 1K<n<10K
---
This dataset is used to train LMs to provide software engineering interview questions.
### Dataset Sources
- https://github.com/in28minutes/JavaInterviewQuestionsAndAnswers/blob/master/readme.md
- https://github.com/sudheerj/angular-interview-questions/blob/master/README.md
- https://github.com/sudheerj/vuejs-interview-questions/blob/master/README.md
- https://github.com/sudheerj/reactjs-interview-questions/blob/master/README.md
- https://github.com/sudheerj/javascript-interview-questions/blob/master/README.md
- https://github.com/arialdomartini/Back-End-Developer-Interview-Questions/blob/master/README.md
- https://github.com/ganqqwerty/123-Essential-JavaScript-Interview-Questions/blob/master/README.md
- https://github.com/h5bp/Front-end-Developer-Interview-Questions/tree/main
- https://www.kaggle.com/datasets/syedmharis/software-engineering-interview-questions-dataset
|
bigbio/bioasq_task_b | ---
language:
- en
bigbio_language:
- English
license: other
multilinguality: monolingual
bigbio_license_shortname: NLM_LICENSE
pretty_name: BioASQ Task B
homepage: http://participants-area.bioasq.org/datasets/
bigbio_pubmed: true
bigbio_public: false
bigbio_tasks:
- QUESTION_ANSWERING
---
# Dataset Card for BioASQ Task B
## Dataset Description
- **Homepage:** http://participants-area.bioasq.org/datasets/
- **Pubmed:** True
- **Public:** False
- **Tasks:** QA
The BioASQ corpus contains multiple question
answering tasks annotated by biomedical experts, including yes/no, factoid, list,
and summary questions. Pertaining to our objective of comparing neural language
models, we focus on the the yes/no questions (Task 7b), and leave the inclusion
of other tasks to future work. Each question is paired with a reference text
containing multiple sentences from a PubMed abstract and a yes/no answer. We use
the official train/dev/test split of 670/75/140 questions.
See 'Domain-Specific Language Model Pretraining for Biomedical
Natural Language Processing'
## Citation Information
```
@article{tsatsaronis2015overview,
title = {
An overview of the BIOASQ large-scale biomedical semantic indexing and
question answering competition
},
author = {
Tsatsaronis, George and Balikas, Georgios and Malakasiotis, Prodromos
and Partalas, Ioannis and Zschunke, Matthias and Alvers, Michael R and
Weissenborn, Dirk and Krithara, Anastasia and Petridis, Sergios and
Polychronopoulos, Dimitris and others
},
year = 2015,
journal = {BMC bioinformatics},
publisher = {BioMed Central Ltd},
volume = 16,
number = 1,
pages = 138
}
```
|
open-llm-leaderboard/details_fionazhang__mistral-environment-all | ---
pretty_name: Evaluation run of fionazhang/mistral-environment-all
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fionazhang/mistral-environment-all](https://huggingface.co/fionazhang/mistral-environment-all)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fionazhang__mistral-environment-all\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T05:12:37.264031](https://huggingface.co/datasets/open-llm-leaderboard/details_fionazhang__mistral-environment-all/blob/main/results_2024-01-16T05-12-37.264031.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2317362073283127,\n\
\ \"acc_stderr\": 0.029930955666961398,\n \"acc_norm\": 0.23270999956714858,\n\
\ \"acc_norm_stderr\": 0.030730298088089192,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.01500067437357034,\n \"mc2\": 0.479195628849322,\n\
\ \"mc2_stderr\": 0.016335101476581883\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.21331058020477817,\n \"acc_stderr\": 0.011970971742326334,\n\
\ \"acc_norm\": 0.29436860068259385,\n \"acc_norm_stderr\": 0.01331852846053943\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2590121489743079,\n\
\ \"acc_stderr\": 0.004371969542814559,\n \"acc_norm\": 0.25891256721768574,\n\
\ \"acc_norm_stderr\": 0.004371422731216411\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2423500611995104,\n \"mc1_stderr\": 0.01500067437357034,\n\
\ \"mc2\": 0.479195628849322,\n \"mc2_stderr\": 0.016335101476581883\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.48697711128650356,\n\
\ \"acc_stderr\": 0.014047718393997662\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/fionazhang/mistral-environment-all
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|arc:challenge|25_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|gsm8k|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hellaswag|10_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T05-12-37.264031.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T05-12-37.264031.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- '**/details_harness|winogrande|5_2024-01-16T05-12-37.264031.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T05-12-37.264031.parquet'
- config_name: results
data_files:
- split: 2024_01_16T05_12_37.264031
path:
- results_2024-01-16T05-12-37.264031.parquet
- split: latest
path:
- results_2024-01-16T05-12-37.264031.parquet
---
# Dataset Card for Evaluation run of fionazhang/mistral-environment-all
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fionazhang/mistral-environment-all](https://huggingface.co/fionazhang/mistral-environment-all) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fionazhang__mistral-environment-all",
"harness_winogrande_5",
	split="latest")
```
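
Once loaded, each per-task configuration exposes its metrics as plain dictionary entries. As a minimal sketch of how the headline numbers might be extracted, the snippet below hard-codes a small dict shaped like the JSON excerpt in the "Latest results" section (values copied from this card's latest run), rather than fetching from the Hub:

```python
# Minimal sketch: reporting normalized accuracy from a results dict shaped
# like the JSON excerpt shown under "Latest results" on this card.
results = {
    "all": {"acc": 0.2317362073283127, "acc_norm": 0.23270999956714858},
    "harness|arc:challenge|25": {"acc_norm": 0.29436860068259385},
    "harness|hellaswag|10": {"acc_norm": 0.25891256721768574},
}

# Print each benchmark's normalized accuracy as a percentage.
for task, metrics in results.items():
    if task != "all":
        print(f"{task}: {metrics['acc_norm'] * 100:.2f}%")
```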
## Latest results
These are the [latest results from run 2024-01-16T05:12:37.264031](https://huggingface.co/datasets/open-llm-leaderboard/details_fionazhang__mistral-environment-all/blob/main/results_2024-01-16T05-12-37.264031.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2317362073283127,
"acc_stderr": 0.029930955666961398,
"acc_norm": 0.23270999956714858,
"acc_norm_stderr": 0.030730298088089192,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.01500067437357034,
"mc2": 0.479195628849322,
"mc2_stderr": 0.016335101476581883
},
"harness|arc:challenge|25": {
"acc": 0.21331058020477817,
"acc_stderr": 0.011970971742326334,
"acc_norm": 0.29436860068259385,
"acc_norm_stderr": 0.01331852846053943
},
"harness|hellaswag|10": {
"acc": 0.2590121489743079,
"acc_stderr": 0.004371969542814559,
"acc_norm": 0.25891256721768574,
"acc_norm_stderr": 0.004371422731216411
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.01500067437357034,
"mc2": 0.479195628849322,
"mc2_stderr": 0.016335101476581883
},
"harness|winogrande|5": {
"acc": 0.48697711128650356,
"acc_stderr": 0.014047718393997662
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
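As an illustrative sketch (the aggregation below is an assumption about post-processing, not part of the evaluation output), per-task scores shaped like the JSON above can be averaged with a few lines of Python; the excerpted values are copied from the results:

```python
# Sketch: aggregating per-task accuracies from a results dict shaped like
# the JSON above. The entries here are illustrative excerpts, not the full
# set of tasks.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.28313253012048195},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.3216374269005848},
    "harness|winogrande|5": {"acc": 0.48697711128650356},
    "harness|gsm8k|5": {"acc": 0.0},
}

# Select only the MMLU (hendrycksTest) subtasks and average their accuracy.
mmlu_accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mean_mmlu_acc = sum(mmlu_accs) / len(mmlu_accs)
print(f"Mean MMLU accuracy over {len(mmlu_accs)} tasks: {mean_mmlu_acc:.4f}")
```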
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
seungpyo-hong/persona-kr | ---
dataset_info:
features:
- name: sessions
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 68726191
num_examples: 24167
download_size: 36223436
dataset_size: 68726191
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
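A minimal sketch of how a record matching the `sessions` schema above (a list of `{content, role}` turns) might be flattened into a plain chat transcript; the sample record below is hypothetical, not drawn from the dataset:

```python
# Sketch: flattening one record with the schema above
# (sessions: list of {content, role}) into a transcript string.
# The sample record is hypothetical.

def sessions_to_transcript(record: dict) -> str:
    """Join each session turn as 'role: content', one turn per line."""
    return "\n".join(
        f"{turn['role']}: {turn['content']}" for turn in record["sessions"]
    )

sample = {
    "sessions": [
        {"role": "user", "content": "안녕하세요?"},
        {"role": "assistant", "content": "안녕하세요! 무엇을 도와드릴까요?"},
    ]
}
print(sessions_to_transcript(sample))
```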
|
EnzoPrezoto/Katheto | ---
license: openrail
---
|
open-llm-leaderboard/details_Locutusque__SlimHercules-4.0-Mistral-7B-v0.2 | ---
pretty_name: Evaluation run of Locutusque/SlimHercules-4.0-Mistral-7B-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Locutusque/SlimHercules-4.0-Mistral-7B-v0.2](https://huggingface.co/Locutusque/SlimHercules-4.0-Mistral-7B-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__SlimHercules-4.0-Mistral-7B-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T14:46:29.497498](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__SlimHercules-4.0-Mistral-7B-v0.2/blob/main/results_2024-04-15T14-46-29.497498.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6258811997664652,\n\
\ \"acc_stderr\": 0.032450560325242114,\n \"acc_norm\": 0.6297976109556077,\n\
\ \"acc_norm_stderr\": 0.033104935238970755,\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059608,\n \"mc2\": 0.453335051403178,\n\
\ \"mc2_stderr\": 0.01464962524455778\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5733788395904437,\n \"acc_stderr\": 0.014453185592920295,\n\
\ \"acc_norm\": 0.6006825938566553,\n \"acc_norm_stderr\": 0.01431209455794671\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6344353714399522,\n\
\ \"acc_stderr\": 0.004806039039008959,\n \"acc_norm\": 0.8353913563035252,\n\
\ \"acc_norm_stderr\": 0.003700690995600888\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.03878139888797611,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.03878139888797611\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04690650298201942,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04690650298201942\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n\
\ \"acc_stderr\": 0.032650194750335815,\n \"acc_norm\": 0.5234042553191489,\n\
\ \"acc_norm_stderr\": 0.032650194750335815\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n\
\ \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n \"\
acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7290322580645161,\n\
\ \"acc_stderr\": 0.025284416114900156,\n \"acc_norm\": 0.7290322580645161,\n\
\ \"acc_norm_stderr\": 0.025284416114900156\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6230769230769231,\n \"acc_stderr\": 0.024570975364225995,\n\
\ \"acc_norm\": 0.6230769230769231,\n \"acc_norm_stderr\": 0.024570975364225995\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n\
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.01703071933915434,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.01703071933915434\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.031911001928357954,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.031911001928357954\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7931034482758621,\n\
\ \"acc_stderr\": 0.01448565604166918,\n \"acc_norm\": 0.7931034482758621,\n\
\ \"acc_norm_stderr\": 0.01448565604166918\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n\
\ \"acc_stderr\": 0.016435865260914742,\n \"acc_norm\": 0.40782122905027934,\n\
\ \"acc_norm_stderr\": 0.016435865260914742\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n\
\ \"acc_stderr\": 0.02521804037341063,\n \"acc_norm\": 0.729903536977492,\n\
\ \"acc_norm_stderr\": 0.02521804037341063\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900926,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900926\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n\
\ \"acc_stderr\": 0.012747248967079076,\n \"acc_norm\": 0.470013037809648,\n\
\ \"acc_norm_stderr\": 0.012747248967079076\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6617647058823529,\n \"acc_stderr\": 0.01913994374848704,\n \
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.01913994374848704\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059608,\n \"mc2\": 0.453335051403178,\n\
\ \"mc2_stderr\": 0.01464962524455778\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597214\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4533737680060652,\n \
\ \"acc_stderr\": 0.01371247104951545\n }\n}\n```"
repo_url: https://huggingface.co/Locutusque/SlimHercules-4.0-Mistral-7B-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|arc:challenge|25_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|gsm8k|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hellaswag|10_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-46-29.497498.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T14-46-29.497498.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- '**/details_harness|winogrande|5_2024-04-15T14-46-29.497498.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T14-46-29.497498.parquet'
- config_name: results
data_files:
- split: 2024_04_15T14_46_29.497498
path:
- results_2024-04-15T14-46-29.497498.parquet
- split: latest
path:
- results_2024-04-15T14-46-29.497498.parquet
---
# Dataset Card for Evaluation run of Locutusque/SlimHercules-4.0-Mistral-7B-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/SlimHercules-4.0-Mistral-7B-v0.2](https://huggingface.co/Locutusque/SlimHercules-4.0-Mistral-7B-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__SlimHercules-4.0-Mistral-7B-v0.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-15T14:46:29.497498](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__SlimHercules-4.0-Mistral-7B-v0.2/blob/main/results_2024-04-15T14-46-29.497498.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6258811997664652,
"acc_stderr": 0.032450560325242114,
"acc_norm": 0.6297976109556077,
"acc_norm_stderr": 0.033104935238970755,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059608,
"mc2": 0.453335051403178,
"mc2_stderr": 0.01464962524455778
},
"harness|arc:challenge|25": {
"acc": 0.5733788395904437,
"acc_stderr": 0.014453185592920295,
"acc_norm": 0.6006825938566553,
"acc_norm_stderr": 0.01431209455794671
},
"harness|hellaswag|10": {
"acc": 0.6344353714399522,
"acc_stderr": 0.004806039039008959,
"acc_norm": 0.8353913563035252,
"acc_norm_stderr": 0.003700690995600888
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.03878139888797611,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.03878139888797611
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7290322580645161,
"acc_stderr": 0.025284416114900156,
"acc_norm": 0.7290322580645161,
"acc_norm_stderr": 0.025284416114900156
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6230769230769231,
"acc_stderr": 0.024570975364225995,
"acc_norm": 0.6230769230769231,
"acc_norm_stderr": 0.024570975364225995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.01703071933915434,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.01703071933915434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.031911001928357954,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.031911001928357954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266196,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266196
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7931034482758621,
"acc_stderr": 0.01448565604166918,
"acc_norm": 0.7931034482758621,
"acc_norm_stderr": 0.01448565604166918
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40782122905027934,
"acc_stderr": 0.016435865260914742,
"acc_norm": 0.40782122905027934,
"acc_norm_stderr": 0.016435865260914742
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341063,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900926,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079076,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079076
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.01913994374848704,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.01913994374848704
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059608,
"mc2": 0.453335051403178,
"mc2_stderr": 0.01464962524455778
},
"harness|winogrande|5": {
"acc": 0.7955801104972375,
"acc_stderr": 0.011334090612597214
},
"harness|gsm8k|5": {
"acc": 0.4533737680060652,
"acc_stderr": 0.01371247104951545
}
}
```
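A side note on reading these numbers: for a given subtask, `acc_stderr` appears consistent with the sample standard error of a Bernoulli proportion, sqrt(p(1-p)/(n-1)). As a minimal sketch (an illustration, not part of the harness itself), inverting that formula for `medical_genetics`, whose values appear above, recovers the task's test-set size:

```python
import math

# acc and acc_stderr for hendrycksTest-medical_genetics, copied from above.
p = 0.75
se = 0.04351941398892446

# Inverting se = sqrt(p * (1 - p) / (n - 1)) for n recovers the
# number of questions in the task's test set (100 for medical_genetics).
n = p * (1 - p) / se**2 + 1
print(round(n))                                       # 100
print(abs(math.sqrt(p * (1 - p) / 99) - se) < 1e-9)   # True
```

The same check works for any of the accuracy-type subtask entries above.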
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
one-sec-cv12/chunk_111 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 22166245584.125
num_examples: 230783
download_size: 18016309353
dataset_size: 22166245584.125
---
# Dataset Card for "chunk_111"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TuningAI/Startups_V2 | ---
license: apache-2.0
task_categories:
- question-answering
- text-generation
language:
- en
tags:
- startups
- ecommerce
- tax
- law
--- |
AlienKevin/source_han_sans_ja_extra_light_left_right | ---
license: cc0-1.0
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/4fd88537 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 180
num_examples: 10
download_size: 1342
dataset_size: 180
---
# Dataset Card for "4fd88537"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Doxo/Tul_a | ---
license: artistic-2.0
---
|
CyberHarem/urakaze_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of urakaze/浦風 (Kantai Collection)
This is the dataset of urakaze/浦風 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `blue_hair, hair_bun, double_bun, blue_eyes, breasts, large_breasts, long_hair, white_headwear, hat, sailor_hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 699.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/urakaze_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 387.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/urakaze_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1305 | 899.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/urakaze_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 619.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/urakaze_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1305 | 1.27 GiB | [Download](https://huggingface.co/datasets/CyberHarem/urakaze_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/urakaze_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1boy, 1girl, blush, elbow_gloves, hetero, serafuku, sleeves_rolled_up, solo_focus, white_gloves, nipples, paizuri, penis, sweat, mosaic_censoring, open_mouth, smile, neckerchief, male_pubic_hair, pov, shirt_lift |
| 1 | 16 |  |  |  |  |  | 1girl, elbow_gloves, serafuku, sleeves_rolled_up, smile, solo, white_gloves, yellow_neckerchief, open_mouth, looking_at_viewer, blush, skirt |
| 2 | 8 |  |  |  |  |  | 1girl, elbow_gloves, looking_at_viewer, pleated_skirt, serafuku, simple_background, solo, white_background, white_gloves, yellow_neckerchief, sleeves_rolled_up, sailor_collar, smile, cowboy_shot, open_mouth |
| 3 | 7 |  |  |  |  |  | 1girl, elbow_gloves, looking_at_viewer, serafuku, sleeves_rolled_up, solo, upper_body, white_background, white_gloves, yellow_neckerchief, blue_sailor_collar, simple_background, smile, blush, collarbone |
| 4 | 7 |  |  |  |  |  | 1girl, elbow_gloves, looking_at_viewer, nipples, nude, simple_background, smile, solo, white_background, white_gloves, blush, navel, open_mouth, full_body, hand_on_hip, standing, striped, thighhighs |
| 5 | 5 |  |  |  |  |  | 1girl, blush, cleavage, looking_at_viewer, off_shoulder, solo, yukata, bare_shoulders, fox_mask, collarbone, mask_on_head, open_mouth, sash, sitting, smile |
| 6 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, medium_hair, simple_background, solo, white_background, blue_one-piece_swimsuit, competition_swimsuit, cowboy_shot, open_mouth, twitter_username, alternate_costume, collarbone, cropped_legs, hair_between_eyes, covered_navel, doughnut_hair_bun, smile, two-tone_swimsuit |
| 7 | 12 |  |  |  |  |  | 1girl, detached_collar, fake_animal_ears, looking_at_viewer, rabbit_ears, solo, playboy_bunny, strapless_leotard, blush, bowtie, cowboy_shot, open_mouth, alternate_costume, cleavage, pantyhose, twitter_username, wrist_cuffs, black_leotard, rabbit_tail, simple_background, white_background, gloves, hair_between_eyes, medium_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | 1girl | blush | elbow_gloves | hetero | serafuku | sleeves_rolled_up | solo_focus | white_gloves | nipples | paizuri | penis | sweat | mosaic_censoring | open_mouth | smile | neckerchief | male_pubic_hair | pov | shirt_lift | solo | yellow_neckerchief | looking_at_viewer | skirt | pleated_skirt | simple_background | white_background | sailor_collar | cowboy_shot | upper_body | blue_sailor_collar | collarbone | nude | navel | full_body | hand_on_hip | standing | striped | thighhighs | cleavage | off_shoulder | yukata | bare_shoulders | fox_mask | mask_on_head | sash | sitting | medium_hair | blue_one-piece_swimsuit | competition_swimsuit | twitter_username | alternate_costume | cropped_legs | hair_between_eyes | covered_navel | doughnut_hair_bun | two-tone_swimsuit | detached_collar | fake_animal_ears | rabbit_ears | playboy_bunny | strapless_leotard | bowtie | pantyhose | wrist_cuffs | black_leotard | rabbit_tail | gloves | medium_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:--------|:--------|:---------------|:---------|:-----------|:--------------------|:-------------|:---------------|:----------|:----------|:--------|:--------|:-------------------|:-------------|:--------|:--------------|:------------------|:------|:-------------|:-------|:---------------------|:--------------------|:--------|:----------------|:--------------------|:-------------------|:----------------|:--------------|:-------------|:---------------------|:-------------|:-------|:--------|:------------|:--------------|:-----------|:----------|:-------------|:-----------|:---------------|:---------|:-----------------|:-----------|:---------------|:-------|:----------|:--------------|:--------------------------|:-----------------------|:-------------------|:--------------------|:---------------|:--------------------|:----------------|:--------------------|:--------------------|:------------------|:-------------------|:--------------|:----------------|:--------------------|:---------|:------------|:--------------|:----------------|:--------------|:---------|:-----------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 16 |  |  |  |  |  | | X | X | X | | X | X | | X | | | | | | X | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | | X | | X | | X | X | | X | | | | | | X | X | | | | | X | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | | X | X | X | | X | X | | X | | | | | | | X | | | | | X | X | X | | | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | | X | X | X | | | | | X | X | | | | | X | X | | | | | X | | X | | | X | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | | X | X | | | | | | | | | | | | X | X | | | | | X | | X | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | | X | X | | | | | | | | | | | | X | X | | | | | X | | X | | | X | X | | X | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 7 | 12 |  |  |  |  |  | | X | X | | | | | | | | | | | | X | | | | | | X | | X | | | X | X | | X | | | | | | | | | | | X | | | | | | | | | | | X | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
gaeunseo/filtered_data_for_first_finetuning_shuffled | ---
dataset_info:
features:
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: id
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: document_id
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 523381042
num_examples: 218025
download_size: 321641607
dataset_size: 523381042
---
# Dataset Card for "filtered_data_for_first_finetuning_shuffled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hf-vision/chest-xray-pneumonia | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': NORMAL
'1': PNEUMONIA
splits:
- name: train
num_bytes: 3186635036.504
num_examples: 5216
- name: validation
num_bytes: 3030633
num_examples: 16
- name: test
num_bytes: 79062317
num_examples: 624
download_size: 1230487171
dataset_size: 3268727986.504
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
license: cc-by-4.0
---
**Dataset Summary**
* The dataset is organized into 3 folders (train, test, val) and contains subfolders for each image category (Pneumonia/Normal). There are 5,863 X-Ray images (JPEG) and 2 categories (Pneumonia/Normal).
* Chest X-ray images (anterior-posterior) were selected from retrospective cohorts of pediatric patients aged one to five years from Guangzhou Women and Children’s Medical Center, Guangzhou. All chest X-ray imaging was performed as part of patients’ routine clinical care.
* For the analysis of chest x-ray images, all chest radiographs were initially screened for quality control by removing all low quality or unreadable scans. The diagnoses for the images were then graded by two expert physicians before being cleared for training the AI system. In order to account for any grading errors, the evaluation set was also checked by a third expert.
* Summary taken from [Application of the AI System for Pneumonia Detection Using Chest X-Ray Images](https://www.cell.com/cell/fulltext/S0092-8674(18)30154-5?_returnURL=https%3A%2F%2Flinkinghub.elsevier.com%2Fretrieve%2Fpii%2FS0092867418301545%3Fshowall%3Dtrue)
* [Dataset source](https://data.mendeley.com/datasets/rscbjbr9sj/2)
**Citation**
Kermany, Daniel; Zhang, Kang; Goldbaum, Michael (2018), “Labeled Optical Coherence Tomography (OCT) and Chest X-Ray Images for Classification”, Mendeley Data, V2, doi: 10.17632/rscbjbr9sj.2 |
CyberHarem/souya_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of souya (Kantai Collection)
This is the dataset of souya (Kantai Collection), containing 172 images and their tags.
The core tags of this character are `brown_hair, long_hair, braid, orange_eyes, single_braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 172 | 135.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/souya_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 172 | 95.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/souya_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 370 | 180.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/souya_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 172 | 125.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/souya_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 370 | 227.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/souya_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/souya_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, pleated_skirt, long_sleeves, white_pantyhose, hooded_jacket, messenger_bag, simple_background, solo, white_background, full_body, orange_jacket, suitcase, white_skirt, coat, machinery, rudder_footwear |
| 1 | 16 |  |  |  |  |  | 1girl, solo, simple_background, white_background, orange_shirt, clothes_writing, official_alternate_costume, upper_body, looking_at_viewer, t-shirt, necklace |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | pleated_skirt | long_sleeves | white_pantyhose | hooded_jacket | messenger_bag | simple_background | solo | white_background | full_body | orange_jacket | suitcase | white_skirt | coat | machinery | rudder_footwear | orange_shirt | clothes_writing | official_alternate_costume | upper_body | looking_at_viewer | t-shirt | necklace |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:---------------|:------------------|:----------------|:----------------|:--------------------|:-------|:-------------------|:------------|:----------------|:-----------|:--------------|:-------|:------------|:------------------|:---------------|:------------------|:-----------------------------|:-------------|:--------------------|:----------|:-----------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 16 |  |  |  |  |  | X | | | | | | X | X | X | | | | | | | | X | X | X | X | X | X | X |
|
zolak/twitter_dataset_50_1713090984 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2694166
num_examples: 6716
download_size: 1349981
dataset_size: 2694166
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-gate_up_down | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-gate_up_down\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T15:53:08.381645](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-gate_up_down/blob/main/results_2023-10-26T15-53-08.381645.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2154991610738255,\n\
\ \"em_stderr\": 0.004210747014430766,\n \"f1\": 0.25919148489932897,\n\
\ \"f1_stderr\": 0.004195696877017449,\n \"acc\": 0.4490387889225113,\n\
\ \"acc_stderr\": 0.01073317504472215\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2154991610738255,\n \"em_stderr\": 0.004210747014430766,\n\
\ \"f1\": 0.25919148489932897,\n \"f1_stderr\": 0.004195696877017449\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1372251705837756,\n \
\ \"acc_stderr\": 0.009477808244600398\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.760852407261247,\n \"acc_stderr\": 0.011988541844843905\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|arc:challenge|25_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T15_53_08.381645
path:
- '**/details_harness|drop|3_2023-10-26T15-53-08.381645.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T15-53-08.381645.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T15_53_08.381645
path:
- '**/details_harness|gsm8k|5_2023-10-26T15-53-08.381645.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T15-53-08.381645.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hellaswag|10_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T15_53_08.381645
path:
- '**/details_harness|winogrande|5_2023-10-26T15-53-08.381645.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T15-53-08.381645.parquet'
- config_name: results
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- results_2023-10-10T11-38-23.134636.parquet
- split: 2023_10_26T15_53_08.381645
path:
- results_2023-10-26T15-53-08.381645.parquet
- split: latest
path:
- results_2023-10-26T15-53-08.381645.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-gate_up_down",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-26T15:53:08.381645](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-gate_up_down/blob/main/results_2023-10-26T15-53-08.381645.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2154991610738255,
"em_stderr": 0.004210747014430766,
"f1": 0.25919148489932897,
"f1_stderr": 0.004195696877017449,
"acc": 0.4490387889225113,
"acc_stderr": 0.01073317504472215
},
"harness|drop|3": {
"em": 0.2154991610738255,
"em_stderr": 0.004210747014430766,
"f1": 0.25919148489932897,
"f1_stderr": 0.004195696877017449
},
"harness|gsm8k|5": {
"acc": 0.1372251705837756,
"acc_stderr": 0.009477808244600398
},
"harness|winogrande|5": {
"acc": 0.760852407261247,
"acc_stderr": 0.011988541844843905
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Ramitha/open-australian-legal-qa-results-k0-k1-k3-hybrid-llama | ---
dataset_info:
features:
- name: case_index
dtype: int64
- name: no_rag_pipeline_result
dtype: string
- name: normal_bert_hybrid_snippet_k1_indexes
dtype: string
- name: normal_bert_hybrid_snippet_k1_context
dtype: string
- name: normal_bert_hybrid_snippet_k1_result
dtype: string
- name: normal_bert_hybrid_case_k1_indexes
dtype: string
- name: normal_bert_hybrid_case_k1_context
dtype: string
- name: normal_bert_hybrid_case_k1_result
dtype: string
- name: legal_bert_hybrid_snippet_k1_indexes
dtype: string
- name: legal_bert_hybrid_snippet_k1_context
dtype: string
- name: legal_bert_hybrid_snippet_k1_result
dtype: string
- name: legal_bert_hybrid_case_k1_indexes
dtype: string
- name: legal_bert_hybrid_case_k1_context
dtype: string
- name: legal_bert_hybrid_case_k1_result
dtype: string
- name: angle_bert_hybrid_snippet_k1_indexes
dtype: string
- name: angle_bert_hybrid_snippet_k1_context
dtype: string
- name: angle_bert_hybrid_snippet_k1_result
dtype: string
- name: angle_bert_hybrid_case_k1_indexes
dtype: string
- name: angle_bert_hybrid_case_k1_context
dtype: string
- name: angle_bert_hybrid_case_k1_result
dtype: string
- name: normal_bert_hybrid_snippet_k3_indexes
dtype: string
- name: normal_bert_hybrid_snippet_k3_context
dtype: string
- name: normal_bert_hybrid_snippet_k3_result
dtype: string
- name: normal_bert_hybrid_case_k3_indexes
dtype: string
- name: normal_bert_hybrid_case_k3_context
dtype: string
- name: normal_bert_hybrid_case_k3_result
dtype: string
- name: legal_bert_hybrid_snippet_k3_indexes
dtype: string
- name: legal_bert_hybrid_snippet_k3_context
dtype: string
- name: legal_bert_hybrid_snippet_k3_result
dtype: string
- name: legal_bert_hybrid_case_k3_indexes
dtype: string
- name: legal_bert_hybrid_case_k3_context
dtype: string
- name: legal_bert_hybrid_case_k3_result
dtype: string
- name: angle_bert_hybrid_snippet_k3_indexes
dtype: string
- name: angle_bert_hybrid_snippet_k3_context
dtype: string
- name: angle_bert_hybrid_snippet_k3_result
dtype: string
- name: angle_bert_hybrid_case_k3_indexes
dtype: string
- name: angle_bert_hybrid_case_k3_context
dtype: string
- name: angle_bert_hybrid_case_k3_result
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: original_texts
dtype: string
- name: question_normal_bert_matching_embeddings
dtype: string
- name: question_legal_bert_matching_embeddings
dtype: string
- name: question_angle_bert_matching_embeddings
dtype: string
- name: question_normal_bert_retrieval_embeddings
dtype: string
- name: question_legal_bert_retrieval_embeddings
dtype: string
- name: question_angle_bert_retrieval_embeddings
dtype: string
- name: answer_normal_bert_matching_embeddings
dtype: string
- name: answer_legal_bert_matching_embeddings
dtype: string
- name: answer_angle_bert_matching_embeddings
dtype: string
- name: answer_normal_bert_retrieval_embeddings
dtype: string
- name: answer_legal_bert_retrieval_embeddings
dtype: string
- name: answer_angle_bert_retrieval_embeddings
dtype: string
splits:
- name: w025w040w035
num_bytes: 8779325
num_examples: 35
download_size: 6165854
dataset_size: 8779325
configs:
- config_name: default
data_files:
- split: w025w040w035
path: data/w025w040w035-*
---
|
646e62/skca_metadata | ---
license: gpl-3.0
---
|
ctu-aic/squad-cs | ---
dataset_info:
features:
- name: title
dtype: string
- name: paragraphs
list:
- name: context
dtype: string
- name: qas
list:
- name: answers
list:
- name: answer_start
dtype: int64
- name: text
dtype: string
- name: text_translated
dtype: string
- name: id
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 23768862
num_examples: 442
- name: validation
num_bytes: 3764911
num_examples: 48
download_size: 16100066
dataset_size: 27533773
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
VietAI/vi_pubmed | ---
license: cc
language:
- vi
- en
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
paperswithcode_id: pubmed
dataset_info:
features:
- name: en
dtype: string
- name: vi
dtype: string
splits:
- name: pubmed22
num_bytes: 44360028980
num_examples: 20087006
download_size: 23041004247
dataset_size: 44360028980
---
# Dataset Summary
20M Vietnamese PubMed biomedical abstracts translated by the [state-of-the-art English-Vietnamese Translation project](https://arxiv.org/abs/2210.05610). The data has been used as an unlabeled dataset for [pretraining a Vietnamese biomedical-domain Transformer model](https://arxiv.org/abs/2210.05598).

image source: [Enriching Biomedical Knowledge for Vietnamese Low-resource Language Through Large-Scale Translation](https://arxiv.org/abs/2210.05598)
# Language
- English: Original biomedical abstracts from [PubMed](https://www.nlm.nih.gov/databases/download/pubmed_medline_faq.html)
- Vietnamese: Synthetic abstracts translated by the [state-of-the-art English-Vietnamese Translation project](https://arxiv.org/abs/2210.05610)
# Dataset Structure
- The English sequences are the original PubMed abstracts.
- The Vietnamese sequences are their synthetic machine translations.
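The parallel en/vi records can be read with the 🤗 `datasets` library. This is a minimal loading sketch, not an official example: the `pubmed22` split name comes from the metadata above, and `streaming=True` is assumed to be preferable given the ~44 GB split size.

```python
def load_vi_pubmed(streaming: bool = True):
    """Load the single 'pubmed22' split of VietAI/vi_pubmed.

    Streaming avoids downloading the full ~23 GB archive up front.
    """
    from datasets import load_dataset  # Hugging Face `datasets` package
    return load_dataset("VietAI/vi_pubmed", split="pubmed22", streaming=streaming)

if __name__ == "__main__":
    ds = load_vi_pubmed()
    pair = next(iter(ds))       # each record is an en/vi parallel pair
    print(pair["en"][:80])      # original English abstract
    print(pair["vi"][:80])      # Vietnamese translation
```

Iterating the streamed dataset yields plain dicts with the two string columns (`en`, `vi`) declared in the card's feature schema.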
# Source Data - Initial Data Collection and Normalization
https://www.nlm.nih.gov/databases/download/pubmed_medline_faq.html
# Licensing Information
[Courtesy of the U.S. National Library of Medicine.](https://www.nlm.nih.gov/databases/download/terms_and_conditions.html)
# Citation
```
@misc{mtet,
doi = {10.48550/ARXIV.2210.05610},
url = {https://arxiv.org/abs/2210.05610},
author = {Ngo, Chinh and Trinh, Trieu H. and Phan, Long and Tran, Hieu and Dang, Tai and Nguyen, Hieu and Nguyen, Minh and Luong, Minh-Thang},
keywords = {Computation and Language (cs.CL), Artificial Intelligence (cs.AI), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {MTet: Multi-domain Translation for English and Vietnamese},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
```
@misc{vipubmed,
doi = {10.48550/ARXIV.2210.05598},
url = {https://arxiv.org/abs/2210.05598},
author = {Phan, Long and Dang, Tai and Tran, Hieu and Phan, Vy and Chau, Lam D. and Trinh, Trieu H.},
keywords = {Computation and Language (cs.CL), Artificial Intelligence (cs.AI), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Enriching Biomedical Knowledge for Vietnamese Low-resource Language Through Large-Scale Translation},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
``` |
open-llm-leaderboard/details_Deathsquad10__TinyLlama-Remix | ---
pretty_name: Evaluation run of Deathsquad10/TinyLlama-Remix
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Deathsquad10/TinyLlama-Remix](https://huggingface.co/Deathsquad10/TinyLlama-Remix)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Deathsquad10__TinyLlama-Remix\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T23:56:01.076134](https://huggingface.co/datasets/open-llm-leaderboard/details_Deathsquad10__TinyLlama-Remix/blob/main/results_2024-01-04T23-56-01.076134.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2755137341639449,\n\
\ \"acc_stderr\": 0.03137650195556653,\n \"acc_norm\": 0.27783507591005674,\n\
\ \"acc_norm_stderr\": 0.032199545340444106,\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015039,\n \"mc2\": 0.4053463843159328,\n\
\ \"mc2_stderr\": 0.014958855520062687\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.28071672354948807,\n \"acc_stderr\": 0.013131238126975584,\n\
\ \"acc_norm\": 0.31143344709897613,\n \"acc_norm_stderr\": 0.013532472099850949\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.38498307110137425,\n\
\ \"acc_stderr\": 0.004855968578998728,\n \"acc_norm\": 0.49502091216889066,\n\
\ \"acc_norm_stderr\": 0.004989533998820355\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.03391160934343602,\n\
\ \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.03391160934343602\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.027724236492700904,\n\
\ \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.027724236492700904\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.28901734104046245,\n\
\ \"acc_stderr\": 0.034564257450869995,\n \"acc_norm\": 0.28901734104046245,\n\
\ \"acc_norm_stderr\": 0.034564257450869995\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.028185441301234102,\n\
\ \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.028185441301234102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.02286083830923207,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.02286083830923207\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.043062412591271526,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.043062412591271526\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3096774193548387,\n \"acc_stderr\": 0.026302774983517418,\n \"\
acc_norm\": 0.3096774193548387,\n \"acc_norm_stderr\": 0.026302774983517418\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2561576354679803,\n \"acc_stderr\": 0.030712730070982592,\n \"\
acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.030712730070982592\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\"\
: 0.18,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.32323232323232326,\n \"acc_stderr\": 0.03332299921070644,\n \"\
acc_norm\": 0.32323232323232326,\n \"acc_norm_stderr\": 0.03332299921070644\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.33589743589743587,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.33589743589743587,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.029597329730978093,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.029597329730978093\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3394495412844037,\n\
\ \"acc_stderr\": 0.02030210934266235,\n \"acc_norm\": 0.3394495412844037,\n\
\ \"acc_norm_stderr\": 0.02030210934266235\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n\
\ \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28921568627450983,\n \"acc_stderr\": 0.031822318676475544,\n \"\
acc_norm\": 0.28921568627450983,\n \"acc_norm_stderr\": 0.031822318676475544\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.24050632911392406,\n \"acc_stderr\": 0.02782078198114968,\n \
\ \"acc_norm\": 0.24050632911392406,\n \"acc_norm_stderr\": 0.02782078198114968\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.273542600896861,\n\
\ \"acc_stderr\": 0.02991858670779884,\n \"acc_norm\": 0.273542600896861,\n\
\ \"acc_norm_stderr\": 0.02991858670779884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596919,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596919\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2066115702479339,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.2066115702479339,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.040598672469526864,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.040598672469526864\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.34951456310679613,\n \"acc_stderr\": 0.04721188506097173,\n\
\ \"acc_norm\": 0.34951456310679613,\n \"acc_norm_stderr\": 0.04721188506097173\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18803418803418803,\n\
\ \"acc_stderr\": 0.025598193686652258,\n \"acc_norm\": 0.18803418803418803,\n\
\ \"acc_norm_stderr\": 0.025598193686652258\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.227330779054917,\n\
\ \"acc_stderr\": 0.014987270640946012,\n \"acc_norm\": 0.227330779054917,\n\
\ \"acc_norm_stderr\": 0.014987270640946012\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.0148933917352496,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.0148933917352496\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.024739981355113596,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.024739981355113596\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24437299035369775,\n\
\ \"acc_stderr\": 0.024406162094668882,\n \"acc_norm\": 0.24437299035369775,\n\
\ \"acc_norm_stderr\": 0.024406162094668882\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2375886524822695,\n \"acc_stderr\": 0.0253895125527299,\n \
\ \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.0253895125527299\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n\
\ \"acc_stderr\": 0.011015752255279327,\n \"acc_norm\": 0.2470664928292047,\n\
\ \"acc_norm_stderr\": 0.011015752255279327\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329376,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329376\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.017667841612378977,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.017667841612378977\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.33636363636363636,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.37142857142857144,\n \"acc_stderr\": 0.030932858792789855,\n\
\ \"acc_norm\": 0.37142857142857144,\n \"acc_norm_stderr\": 0.030932858792789855\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n\
\ \"acc_stderr\": 0.029705284056772432,\n \"acc_norm\": 0.22885572139303484,\n\
\ \"acc_norm_stderr\": 0.029705284056772432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036847,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036847\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n\
\ \"acc_stderr\": 0.034106466140718564,\n \"acc_norm\": 0.25903614457831325,\n\
\ \"acc_norm_stderr\": 0.034106466140718564\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.19298245614035087,\n \"acc_stderr\": 0.03026745755489847,\n\
\ \"acc_norm\": 0.19298245614035087,\n \"acc_norm_stderr\": 0.03026745755489847\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015039,\n \"mc2\": 0.4053463843159328,\n\
\ \"mc2_stderr\": 0.014958855520062687\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5540647198105761,\n \"acc_stderr\": 0.01397009348233069\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225274\n }\n}\n```"
repo_url: https://huggingface.co/Deathsquad10/TinyLlama-Remix
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|arc:challenge|25_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|gsm8k|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hellaswag|10_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T23-56-01.076134.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T23-56-01.076134.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- '**/details_harness|winogrande|5_2024-01-04T23-56-01.076134.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T23-56-01.076134.parquet'
- config_name: results
data_files:
- split: 2024_01_04T23_56_01.076134
path:
- results_2024-01-04T23-56-01.076134.parquet
- split: latest
path:
- results_2024-01-04T23-56-01.076134.parquet
---
# Dataset Card for Evaluation run of Deathsquad10/TinyLlama-Remix
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Deathsquad10/TinyLlama-Remix](https://huggingface.co/Deathsquad10/TinyLlama-Remix) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Deathsquad10__TinyLlama-Remix",
"harness_winogrande_5",
	split="latest")
```
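The configuration names above are flattened versions of the harness task names (for example, `harness_hendrycksTest_anatomy_5` corresponds to the task `harness|hendrycksTest-anatomy|5` evaluated with 5 shots). As a rough illustration (this helper is not part of the `datasets` API, and task names containing characters other than `-`, such as `truthfulqa:mc`, would need extra special-casing), the mapping can be sketched like this:

```python
# Illustrative helper (hypothetical, not an official API): recover the
# harness task name and shot count from a config name such as
# "harness_hendrycksTest_anatomy_5".
def config_to_task(config_name: str) -> tuple[str, int]:
    # Strip the "harness_" prefix and split off the trailing shot count.
    body, _, shots = config_name.removeprefix("harness_").rpartition("_")
    if body.startswith("hendrycksTest_"):
        # MMLU subtasks use a hyphen between suite and subject,
        # e.g. "hendrycksTest-abstract_algebra".
        body = "hendrycksTest-" + body.removeprefix("hendrycksTest_")
    return f"harness|{body}|{shots}", int(shots)

print(config_to_task("harness_hendrycksTest_anatomy_5"))
print(config_to_task("harness_winogrande_5"))
```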
## Latest results
These are the [latest results from run 2024-01-04T23:56:01.076134](https://huggingface.co/datasets/open-llm-leaderboard/details_Deathsquad10__TinyLlama-Remix/blob/main/results_2024-01-04T23-56-01.076134.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.2755137341639449,
"acc_stderr": 0.03137650195556653,
"acc_norm": 0.27783507591005674,
"acc_norm_stderr": 0.032199545340444106,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015039,
"mc2": 0.4053463843159328,
"mc2_stderr": 0.014958855520062687
},
"harness|arc:challenge|25": {
"acc": 0.28071672354948807,
"acc_stderr": 0.013131238126975584,
"acc_norm": 0.31143344709897613,
"acc_norm_stderr": 0.013532472099850949
},
"harness|hellaswag|10": {
"acc": 0.38498307110137425,
"acc_stderr": 0.004855968578998728,
"acc_norm": 0.49502091216889066,
"acc_norm_stderr": 0.004989533998820355
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.03391160934343602,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.03391160934343602
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.027724236492700904,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.027724236492700904
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.034564257450869995,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.034564257450869995
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.028185441301234102,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.028185441301234102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746325,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.02286083830923207,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.02286083830923207
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.043062412591271526,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.043062412591271526
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3096774193548387,
"acc_stderr": 0.026302774983517418,
"acc_norm": 0.3096774193548387,
"acc_norm_stderr": 0.026302774983517418
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.030712730070982592,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.030712730070982592
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.32323232323232326,
"acc_stderr": 0.03332299921070644,
"acc_norm": 0.32323232323232326,
"acc_norm_stderr": 0.03332299921070644
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.33589743589743587,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.33589743589743587,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.029597329730978093,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.029597329730978093
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3394495412844037,
"acc_stderr": 0.02030210934266235,
"acc_norm": 0.3394495412844037,
"acc_norm_stderr": 0.02030210934266235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28921568627450983,
"acc_stderr": 0.031822318676475544,
"acc_norm": 0.28921568627450983,
"acc_norm_stderr": 0.031822318676475544
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24050632911392406,
"acc_stderr": 0.02782078198114968,
"acc_norm": 0.24050632911392406,
"acc_norm_stderr": 0.02782078198114968
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.273542600896861,
"acc_stderr": 0.02991858670779884,
"acc_norm": 0.273542600896861,
"acc_norm_stderr": 0.02991858670779884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596919,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596919
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2066115702479339,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.2066115702479339,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.040598672469526864,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.040598672469526864
},
"harness|hendrycksTest-management|5": {
"acc": 0.34951456310679613,
"acc_stderr": 0.04721188506097173,
"acc_norm": 0.34951456310679613,
"acc_norm_stderr": 0.04721188506097173
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18803418803418803,
"acc_stderr": 0.025598193686652258,
"acc_norm": 0.18803418803418803,
"acc_norm_stderr": 0.025598193686652258
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.227330779054917,
"acc_stderr": 0.014987270640946012,
"acc_norm": 0.227330779054917,
"acc_norm_stderr": 0.014987270640946012
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757183,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.0148933917352496,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.0148933917352496
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.024739981355113596,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.024739981355113596
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24437299035369775,
"acc_stderr": 0.024406162094668882,
"acc_norm": 0.24437299035369775,
"acc_norm_stderr": 0.024406162094668882
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2375886524822695,
"acc_stderr": 0.0253895125527299,
"acc_norm": 0.2375886524822695,
"acc_norm_stderr": 0.0253895125527299
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2470664928292047,
"acc_stderr": 0.011015752255279327,
"acc_norm": 0.2470664928292047,
"acc_norm_stderr": 0.011015752255279327
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329376,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329376
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.017667841612378977,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.017667841612378977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37142857142857144,
"acc_stderr": 0.030932858792789855,
"acc_norm": 0.37142857142857144,
"acc_norm_stderr": 0.030932858792789855
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772432,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772432
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036847,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036847
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.034106466140718564,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.034106466140718564
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.19298245614035087,
"acc_stderr": 0.03026745755489847,
"acc_norm": 0.19298245614035087,
"acc_norm_stderr": 0.03026745755489847
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015039,
"mc2": 0.4053463843159328,
"mc2_stderr": 0.014958855520062687
},
"harness|winogrande|5": {
"acc": 0.5540647198105761,
"acc_stderr": 0.01397009348233069
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225274
}
}
```
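To aggregate metrics from a payload shaped like the JSON above, one option is a macro (unweighted) average of the per-subtask `acc` values. The sketch below inlines only a small excerpt of the real numbers for illustration; the leaderboard's own aggregation may differ:

```python
# Sketch: macro-average the per-subtask "acc" values from a results payload
# shaped like the JSON above. Only an excerpt of the numbers is inlined here.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.21},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.2962962962962963},
    "harness|winogrande|5": {"acc": 0.5540647198105761},
}

# Select only the MMLU (hendrycksTest) subtasks and average them equally.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU macro-average acc over {len(mmlu_accs)} subtasks: {mmlu_avg:.4f}")
```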
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Andyrasika/banking-marketing | ---
license: openrail
dataset_info:
features:
- name: age
dtype: int64
- name: job
dtype: string
- name: marital
dtype: string
- name: education
dtype: string
- name: default
dtype: string
- name: balance
dtype: int64
- name: housing
dtype: string
- name: loan
dtype: string
- name: contact
dtype: string
- name: day
dtype: int64
- name: month
dtype: string
- name: duration
dtype: int64
- name: campaign
dtype: int64
- name: pdays
dtype: int64
- name: previous
dtype: int64
- name: poutcome
dtype: string
- name: y
dtype: string
splits:
- name: train
num_bytes: 6654353
num_examples: 45211
- name: test
num_bytes: 665707
num_examples: 4521
download_size: 834481
dataset_size: 7320060
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
## About Dataset
### Context
Term deposits are a major source of income for a bank. A term deposit is a cash investment held at a financial institution. Your money is invested for an agreed rate of interest over a fixed amount of time, or term. The bank has various outreach plans to sell term deposits to their customers such as email marketing, advertisements, telephonic marketing, and digital marketing.
Telephonic marketing campaigns still remain one of the most effective ways to reach out to people. However, they require a huge investment, as large call centers are hired to actually execute these campaigns. Hence, it is crucial to identify beforehand the customers most likely to convert, so that they can be specifically targeted via call.
The data is related to direct marketing campaigns (phone calls) of a Portuguese banking institution. The classification goal is to predict if the client will subscribe to a term deposit (variable y).
### Content
The data is related to the direct marketing campaigns of a Portuguese banking institution. The marketing campaigns were based on phone calls. Often, more than one contact with the same client was required in order to assess whether the product (a bank term deposit) would ('yes') or would not ('no') be subscribed to by the client. The data folder contains two datasets:
- train.csv: 45,211 rows and 18 columns, ordered by date (from May 2008 to November 2010)
- test.csv: 4,521 rows and 18 columns, a randomly selected 10% of the examples from train.csv
### Detailed Column Descriptions
**Bank client data:**
- 1 - age (numeric)
- 2 - job : type of job (categorical: "admin.","unknown","unemployed","management","housemaid","entrepreneur","student","blue-collar","self-employed","retired","technician","services")
- 3 - marital : marital status (categorical: "married","divorced","single"; note: "divorced" means divorced or widowed)
- 4 - education (categorical: "unknown","secondary","primary","tertiary")
- 5 - default: has credit in default? (binary: "yes","no")
- 6 - balance: average yearly balance, in euros (numeric)
- 7 - housing: has housing loan? (binary: "yes","no")
- 8 - loan: has personal loan? (binary: "yes","no")
**Related to the last contact of the current campaign:**
- 9 - contact: contact communication type (categorical: "unknown","telephone","cellular")
- 10 - day: last contact day of the month (numeric)
- 11 - month: last contact month of year (categorical: "jan", "feb", "mar", …, "nov", "dec")
- 12 - duration: last contact duration, in seconds (numeric)
**Other attributes:**
- 13 - campaign: number of contacts performed during this campaign and for this client (numeric, includes last contact)
- 14 - pdays: number of days that passed by after the client was last contacted from a previous campaign (numeric, -1 means client was not previously contacted)
- 15 - previous: number of contacts performed before this campaign and for this client (numeric)
- 16 - poutcome: outcome of the previous marketing campaign (categorical: "unknown","other","failure","success")
**Output variable (desired target):**
- 17 - y - has the client subscribed to a term deposit? (binary: "yes","no")
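As a quick sanity check of the schema above, the sketch below builds a two-row frame with the columns described in this card and derives a binary target plus a "never contacted" flag (`pdays == -1` per the column description). The column names come from the card; the sample values are invented for illustration.

```python
import pandas as pd

# Two made-up rows that follow the schema described above.
rows = [
    {"age": 35, "job": "management", "marital": "married", "education": "tertiary",
     "default": "no", "balance": 1200, "housing": "yes", "loan": "no",
     "contact": "cellular", "day": 5, "month": "may", "duration": 210,
     "campaign": 1, "pdays": -1, "previous": 0, "poutcome": "unknown", "y": "yes"},
    {"age": 52, "job": "technician", "marital": "single", "education": "secondary",
     "default": "no", "balance": -300, "housing": "no", "loan": "yes",
     "contact": "telephone", "day": 17, "month": "nov", "duration": 95,
     "campaign": 3, "pdays": 180, "previous": 2, "poutcome": "failure", "y": "no"},
]
df = pd.DataFrame(rows)

# Encode the binary target for modelling, and flag clients who were
# never contacted in a previous campaign (pdays == -1).
df["y_bin"] = (df["y"] == "yes").astype(int)
df["never_contacted"] = df["pdays"].eq(-1)

print(df[["y", "y_bin", "never_contacted"]])
```

The real data can be loaded the same way via `load_dataset("Andyrasika/banking-marketing")` and converted with `.to_pandas()`.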
conv_ai | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- conversational
- text-classification
task_ids:
- text-scoring
paperswithcode_id: null
pretty_name: ConvAi
tags:
- evaluating-dialogue-systems
dataset_info:
features:
- name: id
dtype: int32
- name: dialogId
dtype: int32
- name: context
dtype: string
- name: users
list:
- name: userType
dtype: string
- name: id
dtype: string
- name: evaluation
list:
- name: breadth
dtype: int32
- name: userId
dtype: string
- name: quality
dtype: int32
- name: engagement
dtype: int32
- name: thread
list:
- name: evaluation
dtype: int32
- name: text
dtype: string
- name: userId
dtype: string
- name: time
dtype: int32
config_name: conv_ai
splits:
- name: train
num_bytes: 3924265
num_examples: 2778
download_size: 5804611
dataset_size: 3924265
---
# Dataset Card for ConvAi
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Add homepage URL here if available (unless it's a GitHub repository)]()
- **Repository:** [If the dataset is hosted on github or has a github homepage, add URL here]()
- **Paper:** [If the dataset was introduced by a paper or there was a paper written describing the dataset, add URL here (landing page for Arxiv paper preferred)]()
- **Leaderboard:** [If the dataset supports an active leaderboard, add link here]()
- **Point of Contact:** [If known, name and email of at least one person the reader can contact for questions about the dataset.]()
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
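Pending a documented example, an instance can be expected to follow the feature schema declared in the YAML header above. The sketch below is a hypothetical instance with invented values, used to show the nesting of `users`, `evaluation`, and `thread`:

```python
# A hypothetical instance following the schema in the YAML header;
# all field values are invented for illustration.
example = {
    "id": 1,
    "dialogId": 42,
    "context": "Paragraph shown to both participants at the start of the dialogue",
    "users": [
        {"userType": "Human", "id": "user_a"},
        {"userType": "Bot", "id": "bot_b"},
    ],
    "evaluation": [
        {"breadth": 3, "userId": "user_a", "quality": 4, "engagement": 4},
    ],
    "thread": [
        {"evaluation": 1, "text": "Hi! Did you read the paragraph?",
         "userId": "user_a", "time": 0},
        {"evaluation": 0, "text": "Yes, it is about history.",
         "userId": "bot_b", "time": 5},
    ],
}

# Average the per-utterance evaluation score across the thread.
avg_eval = sum(turn["evaluation"] for turn in example["thread"]) / len(example["thread"])
print(avg_eval)
```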
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@patil-suraj](https://github.com/patil-suraj) for adding this dataset. |
skyprolk/iPhone-Wallpapers | ---
license: unknown
tags:
- art
size_categories:
- n<1K
--- |
heliosprime/twitter_dataset_1712917335 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2968
num_examples: 7
download_size: 6783
dataset_size: 2968
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712917335"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dog/fuego-20230215-081313-fc71e8 | ---
tags:
- fuego
fuego:
id: 20230215-081313-fc71e8
status: done
script: run.py
requirements_file: requirements.txt
space_id: dog/actlearn-fuego-runner
space_hardware: cpu-basic
---
|
ovior/twitter_dataset_1713039240 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2299794
num_examples: 7122
download_size: 1293060
dataset_size: 2299794
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
miss-swan/Website-Segmentation | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 75663616.0
num_examples: 10
download_size: 0
dataset_size: 75663616.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Website-Segmentation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HuggingFaceM4/librispeech_asr | Invalid username or password. |
Faeu/mathew1 | ---
license: openrail
---
|
geekyrakshit/LoL-Dataset | ---
license: unknown
tags:
- computer-vision
---
The LOL dataset is composed of 500 low-light and normal-light image pairs and is divided into 485 training pairs and 15 testing pairs. The low-light images contain noise produced during the photo capture process. Most of the images are indoor scenes. All the images have a resolution of 400×600. The dataset was introduced in the paper [Deep Retinex Decomposition for Low-Light Enhancement](https://arxiv.org/abs/1808.04560v1). |
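To illustrate what a low/normal pair looks like numerically, the sketch below fabricates a dummy pair at the dataset's stated 400×600 resolution and measures the mean-brightness gap. The arrays stand in for the actual image files (whose on-disk layout is not described here), and the 0.2 darkening factor and noise level are arbitrary choices for the simulation.

```python
import numpy as np

H, W = 400, 600  # resolution stated in the description above

rng = np.random.default_rng(0)
normal = rng.integers(0, 256, size=(H, W, 3), dtype=np.uint8)

# Simulate a low-light capture: scale brightness down and add sensor noise,
# mimicking the noise produced during the photo capture process.
low = (normal * 0.2 + rng.normal(0, 5, size=normal.shape)).clip(0, 255).astype(np.uint8)

# Mean-brightness gain a Retinex-style enhancer would need to recover.
gain = normal.mean() / max(low.mean(), 1e-6)
print(f"mean-brightness gain normal/low: {gain:.2f}")
```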
Phongngo2608/rvl_cdip | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_Harshvir__Llama-2-7B-physics | ---
pretty_name: Evaluation run of Harshvir/Llama-2-7B-physics
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Harshvir/Llama-2-7B-physics](https://huggingface.co/Harshvir/Llama-2-7B-physics)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Harshvir__Llama-2-7B-physics\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T20:39:36.366627](https://huggingface.co/datasets/open-llm-leaderboard/details_Harshvir__Llama-2-7B-physics/blob/main/results_2023-09-17T20-39-36.366627.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03680788590604027,\n\
\ \"em_stderr\": 0.0019282642409219751,\n \"f1\": 0.10780620805369148,\n\
\ \"f1_stderr\": 0.0024191974799882767,\n \"acc\": 0.39476463537886264,\n\
\ \"acc_stderr\": 0.009842042454929716\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.03680788590604027,\n \"em_stderr\": 0.0019282642409219751,\n\
\ \"f1\": 0.10780620805369148,\n \"f1_stderr\": 0.0024191974799882767\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07050796057619409,\n \
\ \"acc_stderr\": 0.007051543813983609\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7190213101815311,\n \"acc_stderr\": 0.012632541095875824\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Harshvir/Llama-2-7B-physics
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|arc:challenge|25_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T20_39_36.366627
path:
- '**/details_harness|drop|3_2023-09-17T20-39-36.366627.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T20-39-36.366627.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T20_39_36.366627
path:
- '**/details_harness|gsm8k|5_2023-09-17T20-39-36.366627.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T20-39-36.366627.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hellaswag|10_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T20_39_36.366627
path:
- '**/details_harness|winogrande|5_2023-09-17T20-39-36.366627.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T20-39-36.366627.parquet'
- config_name: results
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- results_2023-08-17T21:02:56.107134.parquet
- split: 2023_09_17T20_39_36.366627
path:
- results_2023-09-17T20-39-36.366627.parquet
- split: latest
path:
- results_2023-09-17T20-39-36.366627.parquet
---
# Dataset Card for Evaluation run of Harshvir/Llama-2-7B-physics
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Harshvir/Llama-2-7B-physics
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Harshvir/Llama-2-7B-physics](https://huggingface.co/Harshvir/Llama-2-7B-physics) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Harshvir__Llama-2-7B-physics",
"harness_winogrande_5",
                    split="latest")
```
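The aggregated metrics in the `"results"` configuration can be loaded the same way. A minimal sketch, assuming the `datasets` library is installed; the `details_repo` helper just rebuilds the `details_<org>__<model>` repo id shown above:

```python
def details_repo(org: str, model: str) -> str:
    """Build the leaderboard details-repo id for a given model, e.g.
    ('Harshvir', 'Llama-2-7B-physics') ->
    'open-llm-leaderboard/details_Harshvir__Llama-2-7B-physics'."""
    return f"open-llm-leaderboard/details_{org}__{model}"


if __name__ == "__main__":
    # Imported lazily: the helper above works without the library installed.
    from datasets import load_dataset

    # Downloads the aggregated results for this model (network access required).
    results = load_dataset(details_repo("Harshvir", "Llama-2-7B-physics"),
                           "results",
                           split="latest")
```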
## Latest results
These are the [latest results from run 2023-09-17T20:39:36.366627](https://huggingface.co/datasets/open-llm-leaderboard/details_Harshvir__Llama-2-7B-physics/blob/main/results_2023-09-17T20-39-36.366627.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.03680788590604027,
"em_stderr": 0.0019282642409219751,
"f1": 0.10780620805369148,
"f1_stderr": 0.0024191974799882767,
"acc": 0.39476463537886264,
"acc_stderr": 0.009842042454929716
},
"harness|drop|3": {
"em": 0.03680788590604027,
"em_stderr": 0.0019282642409219751,
"f1": 0.10780620805369148,
"f1_stderr": 0.0024191974799882767
},
"harness|gsm8k|5": {
"acc": 0.07050796057619409,
"acc_stderr": 0.007051543813983609
},
"harness|winogrande|5": {
"acc": 0.7190213101815311,
"acc_stderr": 0.012632541095875824
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tbomez/test | ---
license: openrail
---
|
nguyenminh871/BroadleafCommerce_broadleaf_3_0_10_GA | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: func
dtype: string
- name: target
dtype: bool
- name: project
dtype: string
splits:
- name: BroadleafCommerce_broadleaf_3_0_10_GA
num_bytes: 6257292
num_examples: 2094
download_size: 1631435
dataset_size: 6257292
---
# Dataset Card for "BroadleafCommerce_broadleaf_3_0_10_GA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MaxYuki/Ryota | ---
license: apache-2.0
---
|
projetosoclts/Pauloratz | ---
license: openrail
---
|
yymYYM/stock_trading_QA | ---
task_categories:
- question-answering
language:
- en
tags:
- finance
- trading
size_categories:
- 1K<n<10K
--- |
tasksource/temporal-nli | ---
license: apache-2.0
---
```
@inproceedings{thukral-etal-2021-probing,
title = "Probing Language Models for Understanding of Temporal Expressions",
author = "Thukral, Shivin and
Kukreja, Kunal and
Kavouras, Christian",
booktitle = "Proceedings of the Fourth BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP",
month = nov,
year = "2021",
address = "Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.blackboxnlp-1.31",
doi = "10.18653/v1/2021.blackboxnlp-1.31",
pages = "396--406",
abstract = "We present three Natural Language Inference (NLI) challenge sets that can evaluate NLI models on their understanding of temporal expressions. More specifically, we probe these models for three temporal properties: (a) the order between points in time, (b) the duration between two points in time, (c) the relation between the magnitude of times specified in different units. We find that although large language models fine-tuned on MNLI have some basic perception of the order between points in time, at large, these models do not have a thorough understanding of the relation between temporal expressions.",
}
``` |
para_crawl | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- bg
- cs
- da
- de
- el
- en
- es
- et
- fi
- fr
- ga
- hr
- hu
- it
- lt
- lv
- mt
- nl
- pl
- pt
- ro
- sk
- sl
- sv
license:
- cc0-1.0
multilinguality:
- translation
pretty_name: ParaCrawl
size_categories:
- 10M<n<100M
source_datasets:
- original
task_categories:
- translation
task_ids: []
paperswithcode_id: paracrawl
dataset_info:
- config_name: enbg
features:
- name: translation
dtype:
translation:
languages:
- en
- bg
splits:
- name: train
num_bytes: 356532771
num_examples: 1039885
download_size: 103743335
dataset_size: 356532771
- config_name: encs
features:
- name: translation
dtype:
translation:
languages:
- en
- cs
splits:
- name: train
num_bytes: 638068353
num_examples: 2981949
download_size: 196410022
dataset_size: 638068353
- config_name: enda
features:
- name: translation
dtype:
translation:
languages:
- en
- da
splits:
- name: train
num_bytes: 598624306
num_examples: 2414895
download_size: 182804827
dataset_size: 598624306
- config_name: ende
features:
- name: translation
dtype:
translation:
languages:
- en
- de
splits:
- name: train
num_bytes: 3997191986
num_examples: 16264448
download_size: 1307754745
dataset_size: 3997191986
- config_name: enel
features:
- name: translation
dtype:
translation:
languages:
- en
- el
splits:
- name: train
num_bytes: 688069020
num_examples: 1985233
download_size: 193553374
dataset_size: 688069020
- config_name: enes
features:
- name: translation
dtype:
translation:
languages:
- en
- es
splits:
- name: train
num_bytes: 6209466040
num_examples: 21987267
download_size: 1953839527
dataset_size: 6209466040
- config_name: enet
features:
- name: translation
dtype:
translation:
languages:
- en
- et
splits:
- name: train
num_bytes: 201408919
num_examples: 853422
download_size: 70158650
dataset_size: 201408919
- config_name: enfi
features:
- name: translation
dtype:
translation:
languages:
- en
- fi
splits:
- name: train
num_bytes: 524624150
num_examples: 2156069
download_size: 159209242
dataset_size: 524624150
- config_name: enfr
features:
- name: translation
dtype:
translation:
languages:
- en
- fr
splits:
- name: train
num_bytes: 9015440258
num_examples: 31374161
download_size: 2827554088
dataset_size: 9015440258
- config_name: enga
features:
- name: translation
dtype:
translation:
languages:
- en
- ga
splits:
- name: train
num_bytes: 104523278
num_examples: 357399
download_size: 29394367
dataset_size: 104523278
- config_name: enhr
features:
- name: translation
dtype:
translation:
languages:
- en
- hr
splits:
- name: train
num_bytes: 247646552
num_examples: 1002053
download_size: 84904103
dataset_size: 247646552
- config_name: enhu
features:
- name: translation
dtype:
translation:
languages:
- en
- hu
splits:
- name: train
num_bytes: 403168065
num_examples: 1901342
download_size: 119784765
dataset_size: 403168065
- config_name: enit
features:
- name: translation
dtype:
translation:
languages:
- en
- it
splits:
- name: train
num_bytes: 3340542050
num_examples: 12162239
download_size: 1066720197
dataset_size: 3340542050
- config_name: enlt
features:
- name: translation
dtype:
translation:
languages:
- en
- lt
splits:
- name: train
num_bytes: 197053694
num_examples: 844643
download_size: 66358392
dataset_size: 197053694
- config_name: enlv
features:
- name: translation
dtype:
translation:
languages:
- en
- lv
splits:
- name: train
num_bytes: 142409870
num_examples: 553060
download_size: 47368967
dataset_size: 142409870
- config_name: enmt
features:
- name: translation
dtype:
translation:
languages:
- en
- mt
splits:
- name: train
num_bytes: 52786023
num_examples: 195502
download_size: 19028352
dataset_size: 52786023
- config_name: ennl
features:
- name: translation
dtype:
translation:
languages:
- en
- nl
splits:
- name: train
num_bytes: 1384042007
num_examples: 5659268
download_size: 420090979
dataset_size: 1384042007
- config_name: enpl
features:
- name: translation
dtype:
translation:
languages:
- en
- pl
splits:
- name: train
num_bytes: 854786500
num_examples: 3503276
download_size: 270427885
dataset_size: 854786500
- config_name: enpt
features:
- name: translation
dtype:
translation:
languages:
- en
- pt
splits:
- name: train
num_bytes: 2031891156
num_examples: 8141940
download_size: 638184462
dataset_size: 2031891156
- config_name: enro
features:
- name: translation
dtype:
translation:
languages:
- en
- ro
splits:
- name: train
num_bytes: 518359240
num_examples: 1952043
download_size: 160684751
dataset_size: 518359240
- config_name: ensk
features:
- name: translation
dtype:
translation:
languages:
- en
- sk
splits:
- name: train
num_bytes: 337704729
num_examples: 1591831
download_size: 101307152
dataset_size: 337704729
- config_name: ensl
features:
- name: translation
dtype:
translation:
languages:
- en
- sl
splits:
- name: train
num_bytes: 182399034
num_examples: 660161
download_size: 65037465
dataset_size: 182399034
- config_name: ensv
features:
- name: translation
dtype:
translation:
languages:
- en
- sv
splits:
- name: train
num_bytes: 875576366
num_examples: 3476729
download_size: 275528370
dataset_size: 875576366
---
# Dataset Card for "para_crawl"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://paracrawl.eu/releases.html](https://paracrawl.eu/releases.html)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 10.36 GB
- **Size of the generated dataset:** 32.90 GB
- **Total amount of disk used:** 43.26 GB
### Dataset Summary
Web-Scale Parallel Corpora for Official European Languages.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### enbg
- **Size of downloaded dataset files:** 103.75 MB
- **Size of the generated dataset:** 356.54 MB
- **Total amount of disk used:** 460.27 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"translation": "{\"bg\": \". “A felirat faragott karnis a bejárat fölött, templom épült 14 Július 1643, A földesúr és felesége Jeremiás Murguleţ, C..."
}
```
#### encs
- **Size of downloaded dataset files:** 196.41 MB
- **Size of the generated dataset:** 638.07 MB
- **Total amount of disk used:** 834.48 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"translation": "{\"cs\": \". “A felirat faragott karnis a bejárat fölött, templom épült 14 Július 1643, A földesúr és felesége Jeremiás Murguleţ, C..."
}
```
#### enda
- **Size of downloaded dataset files:** 182.81 MB
- **Size of the generated dataset:** 598.62 MB
- **Total amount of disk used:** 781.43 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"translation": "{\"da\": \". “A felirat faragott karnis a bejárat fölött, templom épült 14 Július 1643, A földesúr és felesége Jeremiás Murguleţ, C..."
}
```
#### ende
- **Size of downloaded dataset files:** 1.31 GB
- **Size of the generated dataset:** 4.00 GB
- **Total amount of disk used:** 5.30 GB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"translation": "{\"de\": \". “A felirat faragott karnis a bejárat fölött, templom épült 14 Július 1643, A földesúr és felesége Jeremiás Murguleţ, C..."
}
```
#### enel
- **Size of downloaded dataset files:** 193.56 MB
- **Size of the generated dataset:** 688.07 MB
- **Total amount of disk used:** 881.62 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"translation": "{\"el\": \". “A felirat faragott karnis a bejárat fölött, templom épült 14 Július 1643, A földesúr és felesége Jeremiás Murguleţ, C..."
}
```
### Data Fields
The data fields are the same among all splits.
#### enbg
- `translation`: a multilingual `string` variable, with possible languages including `en`, `bg`.
#### encs
- `translation`: a multilingual `string` variable, with possible languages including `en`, `cs`.
#### enda
- `translation`: a multilingual `string` variable, with possible languages including `en`, `da`.
#### ende
- `translation`: a multilingual `string` variable, with possible languages including `en`, `de`.
#### enel
- `translation`: a multilingual `string` variable, with possible languages including `en`, `el`.
### Data Splits
| name | train |
|------|---------:|
| enbg | 1039885 |
| encs | 2981949 |
| enda | 2414895 |
| ende | 16264448 |
| enel | 1985233 |
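Any of the configurations above can be loaded by name; each row's `translation` field maps the two language codes to their strings. A minimal sketch for the `enbg` pair, assuming the `datasets` library is installed (the `pair_configs` helper just rebuilds the `en<xx>` config names used by this dataset):

```python
def pair_configs(languages):
    """Build the en-X config names used by this dataset,
    e.g. ['en', 'bg', 'cs'] -> ['enbg', 'encs']."""
    return [f"en{lang}" for lang in languages if lang != "en"]


if __name__ == "__main__":
    # Imported lazily: the helper above works without the library installed.
    from datasets import load_dataset

    # Downloads the English-Bulgarian pairs (~104 MB; network access required).
    ds = load_dataset("para_crawl", "enbg", split="train")
    example = ds[0]["translation"]
    print(example["en"], example["bg"])
```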
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[Creative Commons CC0 license ("no rights reserved")](https://creativecommons.org/share-your-work/public-domain/cc0/).
### Citation Information
```
@inproceedings{banon-etal-2020-paracrawl,
title = "{P}ara{C}rawl: Web-Scale Acquisition of Parallel Corpora",
author = "Ba{\~n}{\'o}n, Marta and
Chen, Pinzhen and
Haddow, Barry and
Heafield, Kenneth and
Hoang, Hieu and
Espl{\`a}-Gomis, Miquel and
Forcada, Mikel L. and
Kamran, Amir and
Kirefu, Faheem and
Koehn, Philipp and
Ortiz Rojas, Sergio and
Pla Sempere, Leopoldo and
Ram{\'\i}rez-S{\'a}nchez, Gema and
Sarr{\'\i}as, Elsa and
Strelec, Marek and
Thompson, Brian and
Waites, William and
Wiggins, Dion and
Zaragoza, Jaume",
booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.acl-main.417",
doi = "10.18653/v1/2020.acl-main.417",
pages = "4555--4567",
abstract = "We report on methods to create the largest publicly available parallel corpora by crawling the web, using open source software. We empirically compare alternative methods and publish benchmark data sets for sentence alignment and sentence pair filtering. We also describe the parallel corpora released and evaluate their quality and their usefulness to create machine translation systems.",
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@lewtun](https://github.com/lewtun), [@patrickvonplaten](https://github.com/patrickvonplaten), [@mariamabarham](https://github.com/mariamabarham) for adding this dataset. |
Capstone-S21/DocTamper | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 4286621190
num_examples: 120000
- name: validation
num_bytes: 121038184
num_examples: 2000
- name: test
num_bytes: 963414453
num_examples: 30000
download_size: 21856589007
dataset_size: 5371073827
license: apache-2.0
task_categories:
- image-segmentation
language:
- en
size_categories:
- 100K<n<1M
--- |
mcmanaman/autotrain-data-bv78-drc7-u5m4 | ---
dataset_info:
features:
- name: autotrain_text
dtype: string
splits:
- name: train
num_bytes: 402
num_examples: 30
- name: validation
num_bytes: 402
num_examples: 30
download_size: 2486
dataset_size: 804
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "autotrain-data-bv78-drc7-u5m4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
khushpatel2002/code-messages | ---
license: apache-2.0
---
|
ShuaKang/keyframes_d_d_gripper | ---
dataset_info:
features:
- name: keyframes_image
dtype: image
- name: text
dtype: string
- name: gripper_image
dtype: image
splits:
- name: train
num_bytes: 711583897.5
num_examples: 14638
download_size: 700376995
dataset_size: 711583897.5
---
# Dataset Card for "keyframes_d_d_gripper"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdrianM0/bicerano_polymers | ---
license: mit
---
|
thisisanshgupta/CodeAlpacaSmall | ---
license: apache-2.0
---
|
jacobbieker/goes-imerg-42hour | ---
license: mit
---
|
jasion/flare-finqa | ---
license: unknown
---
|
distilled-from-one-sec-cv12/chunk_256 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 894984876
num_examples: 174393
download_size: 909589337
dataset_size: 894984876
---
# Dataset Card for "chunk_256"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/antique_train | ---
pretty_name: '`antique/train`'
viewer: false
source_datasets: ['irds/antique']
task_categories:
- text-retrieval
---
# Dataset Card for `antique/train`
The `antique/train` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/antique#antique/train).
# Data
This dataset provides:
- `queries` (i.e., topics); count=2,426
- `qrels`: (relevance assessments); count=27,422
- For `docs`, use [`irds/antique`](https://huggingface.co/datasets/irds/antique)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/antique_train', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/antique_train', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
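As a minimal sketch of how the qrels records above might be consumed (using hypothetical in-line records rather than the real download), the per-record relevance assessments can be collected into a `query_id -> {doc_id: relevance}` map for evaluation:

```python
from collections import defaultdict

# Hypothetical qrels records in the shape shown above; a real run would
# iterate over load_dataset('irds/antique_train', 'qrels') instead.
qrels_records = [
    {'query_id': 'q1', 'doc_id': 'd1', 'relevance': 3, 'iteration': 'Q0'},
    {'query_id': 'q1', 'doc_id': 'd2', 'relevance': 1, 'iteration': 'Q0'},
    {'query_id': 'q2', 'doc_id': 'd3', 'relevance': 4, 'iteration': 'Q0'},
]

# Build query_id -> {doc_id: relevance} for quick lookup during evaluation.
qrels = defaultdict(dict)
for rec in qrels_records:
    qrels[rec['query_id']][rec['doc_id']] = rec['relevance']

print(dict(qrels))
```

The same loop works unchanged on the real dataset records, since each record is a flat mapping with the keys shown in the usage example.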
## Citation Information
```
@inproceedings{Hashemi2020Antique,
title={ANTIQUE: A Non-Factoid Question Answering Benchmark},
author={Helia Hashemi and Mohammad Aliannejadi and Hamed Zamani and Bruce Croft},
booktitle={ECIR},
year={2020}
}
```
|
distilled-from-one-sec-cv12/chunk_144 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1096728928
num_examples: 213704
download_size: 1122398770
dataset_size: 1096728928
---
# Dataset Card for "chunk_144"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-00ac2adb-9115202 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cifar10
eval_info:
task: image_multi_class_classification
model: tanlq/vit-base-patch16-224-in21k-finetuned-cifar10
metrics: []
dataset_name: cifar10
dataset_config: plain_text
dataset_split: test
col_mapping:
image: img
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Image Classification
* Model: tanlq/vit-base-patch16-224-in21k-finetuned-cifar10
* Dataset: cifar10
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@davidberg](https://huggingface.co/davidberg) for evaluating this model. |
kaleemWaheed/twitter_dataset_1713228666 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 27577
num_examples: 62
download_size: 14049
dataset_size: 27577
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jxie/hiv | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train_0
num_bytes: 1869578
num_examples: 32901
- name: val_0
num_bytes: 256545
num_examples: 4113
- name: test_0
num_bytes: 232200
num_examples: 4113
- name: train_1
num_bytes: 1869578
num_examples: 32901
- name: val_1
num_bytes: 256545
num_examples: 4113
- name: test_1
num_bytes: 232200
num_examples: 4113
- name: train_2
num_bytes: 1869578
num_examples: 32901
- name: val_2
num_bytes: 256545
num_examples: 4113
- name: test_2
num_bytes: 232200
num_examples: 4113
download_size: 2758764
dataset_size: 7074969
---
# Dataset Card for "hiv"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/zeta_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of zeta/ゼタ (Granblue Fantasy)
This is the dataset of zeta/ゼタ (Granblue Fantasy), containing 500 images and their tags.
The core tags of this character are `blonde_hair, long_hair, breasts, blue_eyes, twintails, hairband, large_breasts, bangs, hair_intakes, braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan), and the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 819.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zeta_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 448.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zeta_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1276 | 976.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zeta_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 727.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zeta_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1276 | 1.38 GiB | [Download](https://huggingface.co/datasets/CyberHarem/zeta_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/zeta_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
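Building on the `item.meta['tags']` mapping shown above, items can be filtered by tag before further processing. This is a hedged sketch using plain dicts as stand-ins for waifuc items, so it runs without the dataset; `filter_by_tag` and the sample filenames are hypothetical:

```python
# Hypothetical stand-ins for waifuc items; each real item exposes its tags
# via item.meta['tags'], as printed in the loading example above.
items = [
    {'filename': '1.png', 'tags': {'1girl': 0.99, 'solo': 0.95, 'smile': 0.80}},
    {'filename': '2.png', 'tags': {'1girl': 0.98, 'red_armor': 0.70}},
]

def filter_by_tag(items, tag):
    """Keep only items carrying the given tag (e.g. one of the cluster tags below)."""
    return [it for it in items if tag in it['tags']]

kept = filter_by_tag(items, 'smile')
print([it['filename'] for it in kept])
```

With real waifuc items, the membership test would read `tag in item.meta['tags']` instead of indexing a plain dict.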
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 33 |  |  |  |  |  | 1girl, solo, cleavage, looking_at_viewer, smile, thighhighs, midriff, navel, skirt, belt, spear, gauntlets, red_armor, holding, blush |
| 1 | 12 |  |  |  |  |  | 1girl, holding_weapon, looking_at_viewer, solo, smile, white_background, simple_background, red_armor, spear, cleavage, sketch |
| 2 | 5 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, simple_background, solo, white_background, blush, gauntlets, red_armor, upper_body, closed_mouth, grin, hair_ornament |
| 3 | 5 |  |  |  |  |  | 1girl, bare_shoulders, belt, cleavage, crop_top, looking_at_viewer, midriff, navel, solo, sunglasses, black_skirt, collarbone, eyewear_on_head, pleated_skirt, smile, blush, miniskirt, off_shoulder, polearm, thighhighs, white_background, boots, coat, fur-trimmed_jacket, green_jacket, holding, long_sleeves, open_clothes, open_mouth, tank_top, thighs, zettai_ryouiki |
| 4 | 6 |  |  |  |  |  | black_gloves, cleavage, detached_sleeves, halloween_costume, looking_at_viewer, smile, witch_hat, 1girl, jack-o'-lantern, midriff, navel, official_alternate_costume, pumpkin, bare_shoulders, solo, striped, thighhighs, white_background, candy, halloween_bucket, open_mouth |
| 5 | 7 |  |  |  |  |  | 1girl, bracelet, cleavage, eyewear_on_head, looking_at_viewer, navel, official_alternate_costume, red_bikini, side-tie_bikini_bottom, sunglasses, shawl, solo, hair_flower, o-ring, medium_breasts, open_mouth, thighs, blush, collarbone, grin, polearm |
| 6 | 6 |  |  |  |  |  | 1girl, beach, bracelet, cleavage, day, eyewear_on_head, looking_at_viewer, official_alternate_costume, outdoors, red_bikini, side-tie_bikini_bottom, solo, sunglasses, blush, navel, ocean, smile, blue_sky, hair_flower, bare_shoulders, collarbone, shawl, thighs |
| 7 | 5 |  |  |  |  |  | 1girl, earrings, looking_at_viewer, red_dress, smile, solo, medium_breasts, blush, bracelet, cleavage_cutout, hair_down, sleeveless_dress |
| 8 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, obi, red_kimono, smile, hair_flower, open_mouth, wide_sleeves, yukata |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | looking_at_viewer | smile | thighhighs | midriff | navel | skirt | belt | spear | gauntlets | red_armor | holding | blush | holding_weapon | white_background | simple_background | sketch | upper_body | closed_mouth | grin | hair_ornament | bare_shoulders | crop_top | sunglasses | black_skirt | collarbone | eyewear_on_head | pleated_skirt | miniskirt | off_shoulder | polearm | boots | coat | fur-trimmed_jacket | green_jacket | long_sleeves | open_clothes | open_mouth | tank_top | thighs | zettai_ryouiki | black_gloves | detached_sleeves | halloween_costume | witch_hat | jack-o'-lantern | official_alternate_costume | pumpkin | striped | candy | halloween_bucket | bracelet | red_bikini | side-tie_bikini_bottom | shawl | hair_flower | o-ring | medium_breasts | beach | day | outdoors | ocean | blue_sky | earrings | red_dress | cleavage_cutout | hair_down | sleeveless_dress | obi | red_kimono | wide_sleeves | yukata |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------------------|:--------|:-------------|:----------|:--------|:--------|:-------|:--------|:------------|:------------|:----------|:--------|:-----------------|:-------------------|:--------------------|:---------|:-------------|:---------------|:-------|:----------------|:-----------------|:-----------|:-------------|:--------------|:-------------|:------------------|:----------------|:------------|:---------------|:----------|:--------|:-------|:---------------------|:---------------|:---------------|:---------------|:-------------|:-----------|:---------|:-----------------|:---------------|:-------------------|:--------------------|:------------|:------------------|:-----------------------------|:----------|:----------|:--------|:-------------------|:-----------|:-------------|:-------------------------|:--------|:--------------|:---------|:-----------------|:--------|:------|:-----------|:--------|:-----------|:-----------|:------------|:------------------|:------------|:-------------------|:------|:-------------|:---------------|:---------|
| 0 | 33 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | X | X | X | | | | | | X | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | | | | | | | | X | X | | X | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | | | | X | X | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | X | X | X | | | | X | | | | | | | X | | | | | | | X | | | | X | | X | X | | | | X | | | | | | | X | | X | | | | | | | X | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | X | X | X | X | | | X | | | | | | | X | | | | | | | | | X | | X | | X | X | | | | | | | | | | | | | X | | | | | | | X | | | | | X | X | X | X | X | | | X | X | X | X | X | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | | X | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | | X | X | X | X | X | | | | |
| 8 | 7 |  |  |  |  |  | X | X | | X | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X |
|
ai-habitat/habitat_test_scenes | ---
license: cc-by-nc-4.0
pretty_name: Habitat Test Scenes
---
# Habitat Test Scenes Dataset
A few lightweight static .glb stages for testing [habitat-sim](https://github.com/facebookresearch/habitat-sim) and [habitat-lab](https://github.com/facebookresearch/habitat-lab) installation and CI without other datasets.
## Contents
`skokloster-castle.glb` - [Scan from Sketchfab](https://sketchfab.com/3d-models/the-king-s-hall-d18155613363445b9b68c0c67196d98d)
`apartment_0.glb` - [Scan from Replica Dataset](https://github.com/facebookresearch/Replica-Dataset) (geometry decimated for simulation and memory efficiency)
`van-gogh-room.glb` - [Synthetic Asset from Sketchfab](https://sketchfab.com/3d-models/van-gogh-room-311d052a9f034ba8bce55a1a8296b6f9)
`.navmesh` files for simulated agent navigation constraints in Habitat-sim.
|
h-mayorquin/ephy_testing_data | ---
license: unlicense
---
|
open-llm-leaderboard/details_Undi95__Emerald-13B | ---
pretty_name: Evaluation run of Undi95/Emerald-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/Emerald-13B](https://huggingface.co/Undi95/Emerald-13B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Emerald-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T18:27:52.311274](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Emerald-13B/blob/main/results_2023-10-23T18-27-52.311274.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.11566694630872483,\n\
\ \"em_stderr\": 0.0032753085227622833,\n \"f1\": 0.18378460570469723,\n\
\ \"f1_stderr\": 0.003376754461365903,\n \"acc\": 0.4437006222575401,\n\
\ \"acc_stderr\": 0.010610978881102105\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.11566694630872483,\n \"em_stderr\": 0.0032753085227622833,\n\
\ \"f1\": 0.18378460570469723,\n \"f1_stderr\": 0.003376754461365903\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1281273692191054,\n \
\ \"acc_stderr\": 0.009206398549980031\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224176\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Undi95/Emerald-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T18_27_52.311274
path:
- '**/details_harness|drop|3_2023-10-23T18-27-52.311274.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T18-27-52.311274.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T18_27_52.311274
path:
- '**/details_harness|gsm8k|5_2023-10-23T18-27-52.311274.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T18-27-52.311274.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-31-23.265550.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-31-23.265550.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-31-23.265550.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T18_27_52.311274
path:
- '**/details_harness|winogrande|5_2023-10-23T18-27-52.311274.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T18-27-52.311274.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_31_23.265550
path:
- results_2023-10-03T17-31-23.265550.parquet
- split: 2023_10_23T18_27_52.311274
path:
- results_2023-10-23T18-27-52.311274.parquet
- split: latest
path:
- results_2023-10-23T18-27-52.311274.parquet
---
# Dataset Card for Evaluation run of Undi95/Emerald-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/Emerald-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/Emerald-13B](https://huggingface.co/Undi95/Emerald-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__Emerald-13B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-23T18:27:52.311274](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Emerald-13B/blob/main/results_2023-10-23T18-27-52.311274.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.11566694630872483,
"em_stderr": 0.0032753085227622833,
"f1": 0.18378460570469723,
"f1_stderr": 0.003376754461365903,
"acc": 0.4437006222575401,
"acc_stderr": 0.010610978881102105
},
"harness|drop|3": {
"em": 0.11566694630872483,
"em_stderr": 0.0032753085227622833,
"f1": 0.18378460570469723,
"f1_stderr": 0.003376754461365903
},
"harness|gsm8k|5": {
"acc": 0.1281273692191054,
"acc_stderr": 0.009206398549980031
},
"harness|winogrande|5": {
"acc": 0.7592738752959748,
"acc_stderr": 0.012015559212224176
}
}
```
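The results dictionary above can be consumed directly once loaded. As a rough sketch (the task keys and values are copied verbatim from the JSON shown here, not fetched from the repo), you might flatten the per-task accuracies like this:

```python
# Hedged sketch: extracting per-task metrics from the "latest results"
# dictionary shown above (task keys copied verbatim from that JSON).
latest = {
    "harness|drop|3": {"em": 0.11566694630872483, "f1": 0.18378460570469723},
    "harness|gsm8k|5": {"acc": 0.1281273692191054},
    "harness|winogrande|5": {"acc": 0.7592738752959748},
}

def accuracy_table(results):
    """Return {task_name: acc} for every task that reports an 'acc' metric."""
    return {task.split("|")[1]: metrics["acc"]
            for task, metrics in results.items()
            if "acc" in metrics}

print(accuracy_table(latest))
# {'gsm8k': 0.1281273692191054, 'winogrande': 0.7592738752959748}
```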
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/83cbe6ee | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1331
dataset_size: 184
---
# Dataset Card for "83cbe6ee"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-phpthinh__exampletx-toxic-7252ee-1708159805 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- phpthinh/exampletx
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-3b
metrics: []
dataset_name: phpthinh/exampletx
dataset_config: toxic
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-3b
* Dataset: phpthinh/exampletx
* Config: toxic
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@phpthinh](https://huggingface.co/phpthinh) for evaluating this model. |
norec | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- nb
- nn
- 'no'
license:
- cc-by-nc-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
paperswithcode_id: norec
pretty_name: NoReC
dataset_info:
features:
- name: idx
dtype: string
- name: text
dtype: string
- name: tokens
sequence: string
- name: lemmas
sequence: string
- name: pos_tags
sequence:
class_label:
names:
'0': ADJ
'1': ADP
'2': ADV
'3': AUX
'4': CCONJ
'5': DET
'6': INTJ
'7': NOUN
'8': NUM
'9': PART
'10': PRON
'11': PROPN
'12': PUNCT
'13': SCONJ
'14': SYM
'15': VERB
'16': X
- name: xpos_tags
sequence: string
- name: feats
sequence: string
- name: head
sequence: string
- name: deprel
sequence: string
- name: deps
sequence: string
- name: misc
sequence: string
splits:
- name: train
num_bytes: 1254757266
num_examples: 680792
- name: validation
num_bytes: 189534106
num_examples: 101106
- name: test
num_bytes: 193801708
num_examples: 101594
download_size: 212492611
dataset_size: 1638093080
---
# Dataset Card for NoReC
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** https://github.com/ltgoslo/norec
- **Paper:** http://www.lrec-conf.org/proceedings/lrec2018/pdf/851.pdf
- **Leaderboard:** [More Information Needed]
- **Point of Contact:** [More Information Needed]
### Dataset Summary
This dataset contains the Norwegian Review Corpus (NoReC), created for the purpose of training and evaluating models for document-level sentiment analysis. More than 43,000 full-text reviews have been collected from major Norwegian news sources and cover a range of different domains, including literature, movies, video games, restaurants, music and theater, in addition to product reviews across a range of categories. Each review is labeled with a manually assigned score of 1–6, as provided by the rating of the original author.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The sentences in the dataset are in Norwegian (nb, nn, no).
## Dataset Structure
### Data Instances
A sample from the training set is provided below:
```
{'deprel': ['det',
'amod',
'cc',
'conj',
'nsubj',
'case',
'nmod',
'cop',
'case',
'case',
'root',
'flat:name',
'flat:name',
'punct'],
'deps': ['None',
'None',
'None',
'None',
'None',
'None',
'None',
'None',
'None',
'None',
'None',
'None',
'None',
'None'],
'feats': ["{'Gender': 'Masc', 'Number': 'Sing', 'PronType': 'Dem'}",
"{'Definite': 'Def', 'Degree': 'Pos', 'Number': 'Sing'}",
'None',
"{'Definite': 'Def', 'Degree': 'Pos', 'Number': 'Sing'}",
"{'Definite': 'Def', 'Gender': 'Masc', 'Number': 'Sing'}",
'None',
'None',
"{'Mood': 'Ind', 'Tense': 'Pres', 'VerbForm': 'Fin'}",
'None',
'None',
'None',
'None',
'None',
'None'],
'head': ['5',
'5',
'4',
'2',
'11',
'7',
'5',
'11',
'11',
'11',
'0',
'11',
'11',
'11'],
'idx': '000000-02-01',
'lemmas': ['den',
'andre',
'og',
'sist',
'sesong',
'av',
'Rome',
'være',
'ute',
'på',
'DVD',
'i',
'Norge',
'$.'],
'misc': ['None',
'None',
'None',
'None',
'None',
'None',
'None',
'None',
'None',
'None',
'None',
'None',
"{'SpaceAfter': 'No'}",
'None'],
'pos_tags': [5, 0, 4, 0, 7, 1, 11, 3, 1, 1, 11, 1, 11, 12],
'text': 'Den andre og siste sesongen av Rome er ute på DVD i Norge.',
'tokens': ['Den',
'andre',
'og',
'siste',
'sesongen',
'av',
'Rome',
'er',
'ute',
'på',
'DVD',
'i',
'Norge',
'.'],
'xpos_tags': ['None',
'None',
'None',
'None',
'None',
'None',
'None',
'None',
'None',
'None',
'None',
'None',
'None',
'None']}
```
### Data Fields
The data instances have the following fields:
- deprel: [More Information Needed]
- deps: [More Information Needed]
- feats: [More Information Needed]
- head: [More Information Needed]
- idx: index
- lemmas: lemmas of all tokens
- misc: [More Information Needed]
- pos_tags: part of speech tags
- text: text string
- tokens: tokens
- xpos_tags: [More Information Needed]
The part-of-speech tags correspond to these labels: "ADJ" (0), "ADP" (1), "ADV" (2), "AUX" (3), "CCONJ" (4), "DET" (5), "INTJ" (6), "NOUN" (7), "NUM" (8), "PART" (9), "PRON" (10), "PROPN" (11), "PUNCT" (12), "SCONJ" (13), "SYM" (14), "VERB" (15), "X" (16).
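As a quick self-contained sketch of that mapping (the label list below is hard-coded from the table above; with the `datasets` library the same decoding is also available through the `pos_tags` ClassLabel feature), the integer tags of the sample instance decode as follows:

```python
# Label list copied from the mapping above; list index == integer tag id.
POS_LABELS = ["ADJ", "ADP", "ADV", "AUX", "CCONJ", "DET", "INTJ", "NOUN",
              "NUM", "PART", "PRON", "PROPN", "PUNCT", "SCONJ", "SYM",
              "VERB", "X"]

def decode_pos(tag_ids):
    """Map integer POS tag ids to their label strings."""
    return [POS_LABELS[i] for i in tag_ids]

# 'pos_tags' from the sample training instance shown earlier:
print(decode_pos([5, 0, 4, 0, 7, 1, 11, 3, 1, 1, 11, 1, 11, 12]))
# ['DET', 'ADJ', 'CCONJ', 'ADJ', 'NOUN', 'ADP', 'PROPN', 'AUX', 'ADP',
#  'ADP', 'PROPN', 'ADP', 'PROPN', 'PUNCT']
```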
### Data Splits
The training, validation, and test sets contain `680792`, `101106`, and `101594` sentences, respectively.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@InProceedings{VelOvrBer18,
author = {Erik Velldal and Lilja {\O}vrelid and
Eivind Alexander Bergem and Cathrine Stadsnes and
Samia Touileb and Fredrik J{\o}rgensen},
title = {{NoReC}: The {N}orwegian {R}eview {C}orpus},
booktitle = {Proceedings of the 11th edition of the
Language Resources and Evaluation Conference},
year = {2018},
address = {Miyazaki, Japan},
pages = {4186--4191}
}
```
### Contributions
Thanks to [@abhishekkrthakur](https://github.com/abhishekkrthakur) for adding this dataset. |
ksuriuri/AI_Vtuber_Chat_History | ---
license: mit
---
|
DBQ/Net.a.Porter.Product.prices.India | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: India - Net-a-Porter - Product-level price list
tags:
- webscraping
- ecommerce
- Net
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: Net-a-Porter
dtype: string
- name: '2023-11-08'
dtype: string
- name: IND
dtype: string
- name: USD
dtype: string
- name: KHAITE
dtype: string
- name: CLOTHING
dtype: string
- name: DRESSES
dtype: string
- name: MINI DRESSES
dtype: string
- name: '1647597303269708'
dtype: int64
- name: Janna strapless duchesse cotton-blend satin mini dress
dtype: string
- name: https://www.net-a-porter.com/in/en/shop/product/khaite/clothing/mini-dresses/janna-strapless-duchesse-cotton-blend-satin-mini-dress/1647597303269708
dtype: string
- name: https://www.net-a-porter.com/variants/images/1647597303269708/ou/w1000.jpg
dtype: string
- name: '2094.00'
dtype: float64
- name: '1047.00'
dtype: float64
- name: '1958.84'
dtype: float64
- name: '979.42'
dtype: float64
- name: '1'
dtype: int64
splits:
- name: train
num_bytes: 17965087
num_examples: 44420
download_size: 5762779
dataset_size: 17965087
---
# Net-a-Porter web scraped data
## About the website
In the Asia Pacific region, particularly India, retail industries are witnessing a significant digital transformation. The **Ecommerce industry** is flourishing, revolutionised by advanced technology, the proliferation of smartphones, and improved internet infrastructure. The **online fashion retail** sector is a key player in this surge, gaining a strong foothold in the world of Ecommerce. **Net-a-Porter**, a premier luxury online fashion retailer, operates within this industry. The dataset examined here provides **Ecommerce product-list page (PLP) data** on Net-a-Porter's offerings in India, revealing significant insights into product range, consumer preferences, price points and trends in the Indian online luxury fashion retail landscape.
## Link to **dataset**
[India - Net-a-Porter - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Net-a-Porter%20Product-prices%20India/r/rec5aEBQ3a7hC1MIp)
|
huggingface/autotrain-data-autotrain-ojuq2-oo4mf | |
gurprbebo/BEBO_DS | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4947
num_examples: 15
download_size: 3142
dataset_size: 4947
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "BEBO_DS"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MMoin/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245925
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_TehVenom__Moderator-Chan_GPT-JT-6b | ---
pretty_name: Evaluation run of TehVenom/Moderator-Chan_GPT-JT-6b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TehVenom/Moderator-Chan_GPT-JT-6b](https://huggingface.co/TehVenom/Moderator-Chan_GPT-JT-6b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__Moderator-Chan_GPT-JT-6b_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-06T16:05:16.771792](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__Moderator-Chan_GPT-JT-6b_public/blob/main/results_2023-11-06T16-05-16.771792.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n\
\ \"em_stderr\": 0.0002964962989801249,\n \"f1\": 0.0455861996644295,\n\
\ \"f1_stderr\": 0.001167270115698605,\n \"acc\": 0.33438429175196105,\n\
\ \"acc_stderr\": 0.008229511585752802\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801249,\n\
\ \"f1\": 0.0455861996644295,\n \"f1_stderr\": 0.001167270115698605\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01288855193328279,\n \
\ \"acc_stderr\": 0.0031069012664996704\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6558800315706393,\n \"acc_stderr\": 0.013352121905005935\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TehVenom/Moderator-Chan_GPT-JT-6b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_05T05_29_39.737368
path:
- '**/details_harness|drop|3_2023-11-05T05-29-39.737368.parquet'
- split: 2023_11_06T16_05_16.771792
path:
- '**/details_harness|drop|3_2023-11-06T16-05-16.771792.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-06T16-05-16.771792.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_05T05_29_39.737368
path:
- '**/details_harness|gsm8k|5_2023-11-05T05-29-39.737368.parquet'
- split: 2023_11_06T16_05_16.771792
path:
- '**/details_harness|gsm8k|5_2023-11-06T16-05-16.771792.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-06T16-05-16.771792.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_05T05_29_39.737368
path:
- '**/details_harness|winogrande|5_2023-11-05T05-29-39.737368.parquet'
- split: 2023_11_06T16_05_16.771792
path:
- '**/details_harness|winogrande|5_2023-11-06T16-05-16.771792.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-06T16-05-16.771792.parquet'
- config_name: results
data_files:
- split: 2023_11_05T05_29_39.737368
path:
- results_2023-11-05T05-29-39.737368.parquet
- split: 2023_11_06T16_05_16.771792
path:
- results_2023-11-06T16-05-16.771792.parquet
- split: latest
path:
- results_2023-11-06T16-05-16.771792.parquet
---
# Dataset Card for Evaluation run of TehVenom/Moderator-Chan_GPT-JT-6b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TehVenom/Moderator-Chan_GPT-JT-6b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TehVenom/Moderator-Chan_GPT-JT-6b](https://huggingface.co/TehVenom/Moderator-Chan_GPT-JT-6b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__Moderator-Chan_GPT-JT-6b_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-06T16:05:16.771792](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__Moderator-Chan_GPT-JT-6b_public/blob/main/results_2023-11-06T16-05-16.771792.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801249,
"f1": 0.0455861996644295,
"f1_stderr": 0.001167270115698605,
"acc": 0.33438429175196105,
"acc_stderr": 0.008229511585752802
},
"harness|drop|3": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801249,
"f1": 0.0455861996644295,
"f1_stderr": 0.001167270115698605
},
"harness|gsm8k|5": {
"acc": 0.01288855193328279,
"acc_stderr": 0.0031069012664996704
},
"harness|winogrande|5": {
"acc": 0.6558800315706393,
"acc_stderr": 0.013352121905005935
}
}
```
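The results JSON above is a plain nested dict (task → metric → value), so it can be flattened into rows for quick comparison across tasks. A minimal sketch using the values shown above (this is illustrative only, not part of the evaluation harness):

```python
# Subset of the latest results JSON shown above, as a nested dict.
results = {
    "all": {
        "em": 0.0008389261744966443,
        "f1": 0.0455861996644295,
        "acc": 0.33438429175196105,
    },
    "harness|gsm8k|5": {"acc": 0.01288855193328279},
    "harness|winogrande|5": {"acc": 0.6558800315706393},
}

# Flatten task/metric pairs into (task, metric, value) rows.
rows = [
    (task, metric, value)
    for task, metrics in results.items()
    for metric, value in metrics.items()
]

for task, metric, value in rows:
    print(f"{task:>22} {metric:<4} {value:.4f}")
```

The same flattening works on the full per-run JSON file once loaded with `json.load`.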
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Neel-Gupta/minipile-processed_512 | ---
dataset_info:
features:
- name: text
sequence:
sequence:
sequence: int64
splits:
- name: train
num_bytes: 41620944144
num_examples: 6609
- name: test
num_bytes: 396749808
num_examples: 63
download_size: 4113650040
dataset_size: 42017693952
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
pritamdeka/dataset_cyner_test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3795169
num_examples: 5250
download_size: 1090344
dataset_size: 3795169
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AlekseyKorshuk/SHP-chatml | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: do_train
dtype: bool
- name: role
dtype: string
splits:
- name: train
num_bytes: 69458847
num_examples: 43269
download_size: 41755344
dataset_size: 69458847
---
# Dataset Card for "SHP-chatml"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_YouKnwMe__Mistral-7b-instruct-v0.2-private-edw2 | ---
pretty_name: Evaluation run of YouKnwMe/Mistral-7b-instruct-v0.2-private-edw2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [YouKnwMe/Mistral-7b-instruct-v0.2-private-edw2](https://huggingface.co/YouKnwMe/Mistral-7b-instruct-v0.2-private-edw2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YouKnwMe__Mistral-7b-instruct-v0.2-private-edw2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-26T02:13:46.736607](https://huggingface.co/datasets/open-llm-leaderboard/details_YouKnwMe__Mistral-7b-instruct-v0.2-private-edw2/blob/main/results_2024-01-26T02-13-46.736607.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6538106805602971,\n\
\ \"acc_stderr\": 0.032094755743030716,\n \"acc_norm\": 0.6535994448450703,\n\
\ \"acc_norm_stderr\": 0.03275904226313868,\n \"mc1\": 0.47368421052631576,\n\
\ \"mc1_stderr\": 0.017479241161975526,\n \"mc2\": 0.6382870589790935,\n\
\ \"mc2_stderr\": 0.015166296712442236\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6706484641638225,\n \"acc_stderr\": 0.013734057652635474,\n\
\ \"acc_norm\": 0.6979522184300341,\n \"acc_norm_stderr\": 0.01341751914471641\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6891057558255328,\n\
\ \"acc_stderr\": 0.004619136497359836,\n \"acc_norm\": 0.8732324238199561,\n\
\ \"acc_norm_stderr\": 0.0033203245481454053\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"\
acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857413,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857413\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291936,\n\
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291936\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590167,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590167\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n\
\ \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n\
\ \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4011173184357542,\n\
\ \"acc_stderr\": 0.016392221899407082,\n \"acc_norm\": 0.4011173184357542,\n\
\ \"acc_norm_stderr\": 0.016392221899407082\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.023683591837008564,\n\
\ \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.023683591837008564\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n\
\ \"acc_stderr\": 0.01273367188034251,\n \"acc_norm\": 0.4621903520208605,\n\
\ \"acc_norm_stderr\": 0.01273367188034251\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.01879808628488689,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.01879808628488689\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47368421052631576,\n\
\ \"mc1_stderr\": 0.017479241161975526,\n \"mc2\": 0.6382870589790935,\n\
\ \"mc2_stderr\": 0.015166296712442236\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510427\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7225170583775588,\n \
\ \"acc_stderr\": 0.012333447581047546\n }\n}\n```"
repo_url: https://huggingface.co/YouKnwMe/Mistral-7b-instruct-v0.2-private-edw2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|arc:challenge|25_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|arc:challenge|25_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|arc:challenge|25_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|gsm8k|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|gsm8k|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|gsm8k|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hellaswag|10_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hellaswag|10_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hellaswag|10_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T01-59-13.262455.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T02-05-09.255481.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T02-13-46.736607.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T02-13-46.736607.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- '**/details_harness|winogrande|5_2024-01-26T01-59-13.262455.parquet'
- split: 2024_01_26T02_05_09.255481
path:
- '**/details_harness|winogrande|5_2024-01-26T02-05-09.255481.parquet'
- split: 2024_01_26T02_13_46.736607
path:
- '**/details_harness|winogrande|5_2024-01-26T02-13-46.736607.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-26T02-13-46.736607.parquet'
- config_name: results
data_files:
- split: 2024_01_26T01_59_13.262455
path:
- results_2024-01-26T01-59-13.262455.parquet
- split: 2024_01_26T02_05_09.255481
path:
- results_2024-01-26T02-05-09.255481.parquet
- split: 2024_01_26T02_13_46.736607
path:
- results_2024-01-26T02-13-46.736607.parquet
- split: latest
path:
- results_2024-01-26T02-13-46.736607.parquet
---
# Dataset Card for Evaluation run of YouKnwMe/Mistral-7b-instruct-v0.2-private-edw2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YouKnwMe/Mistral-7b-instruct-v0.2-private-edw2](https://huggingface.co/YouKnwMe/Mistral-7b-instruct-v0.2-private-edw2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YouKnwMe__Mistral-7b-instruct-v0.2-private-edw2",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-26T02:13:46.736607](https://huggingface.co/datasets/open-llm-leaderboard/details_YouKnwMe__Mistral-7b-instruct-v0.2-private-edw2/blob/main/results_2024-01-26T02-13-46.736607.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6538106805602971,
"acc_stderr": 0.032094755743030716,
"acc_norm": 0.6535994448450703,
"acc_norm_stderr": 0.03275904226313868,
"mc1": 0.47368421052631576,
"mc1_stderr": 0.017479241161975526,
"mc2": 0.6382870589790935,
"mc2_stderr": 0.015166296712442236
},
"harness|arc:challenge|25": {
"acc": 0.6706484641638225,
"acc_stderr": 0.013734057652635474,
"acc_norm": 0.6979522184300341,
"acc_norm_stderr": 0.01341751914471641
},
"harness|hellaswag|10": {
"acc": 0.6891057558255328,
"acc_stderr": 0.004619136497359836,
"acc_norm": 0.8732324238199561,
"acc_norm_stderr": 0.0033203245481454053
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857413,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857413
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291936,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291936
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590167,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590167
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4011173184357542,
"acc_stderr": 0.016392221899407082,
"acc_norm": 0.4011173184357542,
"acc_norm_stderr": 0.016392221899407082
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.023683591837008564,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.023683591837008564
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4621903520208605,
"acc_stderr": 0.01273367188034251,
"acc_norm": 0.4621903520208605,
"acc_norm_stderr": 0.01273367188034251
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.01879808628488689,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.01879808628488689
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47368421052631576,
"mc1_stderr": 0.017479241161975526,
"mc2": 0.6382870589790935,
"mc2_stderr": 0.015166296712442236
},
"harness|winogrande|5": {
"acc": 0.8089976322020521,
"acc_stderr": 0.011047808761510427
},
"harness|gsm8k|5": {
"acc": 0.7225170583775588,
"acc_stderr": 0.012333447581047546
}
}
```
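For downstream analysis, the per-task entries above can be aggregated locally once the results JSON is loaded. A minimal sketch (the `results` dict below is an abridged, hypothetical copy of a few entries from the JSON above; key names follow the harness output format):

```python
# Abridged copy of a few per-task entries from the results JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.36},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6973684210526315},
}

# Macro-average accuracy over the MMLU (hendrycksTest) tasks present.
mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
mmlu_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"MMLU macro-avg acc over {len(mmlu_tasks)} tasks: {mmlu_acc:.4f}")
```

The same filtering pattern works for any task family (e.g. `"truthfulqa"` or `"winogrande"`) and for the `acc_norm` fields.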
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_KoboldAI__GPT-J-6B-Janeway | ---
pretty_name: Evaluation run of KoboldAI/GPT-J-6B-Janeway
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KoboldAI/GPT-J-6B-Janeway](https://huggingface.co/KoboldAI/GPT-J-6B-Janeway)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
  \ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KoboldAI__GPT-J-6B-Janeway\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2023-10-21T15:51:36.283517](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__GPT-J-6B-Janeway/blob/main/results_2023-10-21T15-51-36.283517.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n\
\ \"em_stderr\": 0.0003476179896857095,\n \"f1\": 0.04762374161073833,\n\
\ \"f1_stderr\": 0.001208940406482686,\n \"acc\": 0.33042240390432354,\n\
\ \"acc_stderr\": 0.008312737588634883\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857095,\n\
\ \"f1\": 0.04762374161073833,\n \"f1_stderr\": 0.001208940406482686\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.013646702047005308,\n \
\ \"acc_stderr\": 0.0031957470754808088\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6471981057616417,\n \"acc_stderr\": 0.013429728101788958\n\
\ }\n}\n```"
repo_url: https://huggingface.co/KoboldAI/GPT-J-6B-Janeway
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_21T15_51_36.283517
path:
- '**/details_harness|drop|3_2023-10-21T15-51-36.283517.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-21T15-51-36.283517.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_21T15_51_36.283517
path:
- '**/details_harness|gsm8k|5_2023-10-21T15-51-36.283517.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-21T15-51-36.283517.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:39:54.753616.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:39:54.753616.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:39:54.753616.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_21T15_51_36.283517
path:
- '**/details_harness|winogrande|5_2023-10-21T15-51-36.283517.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-21T15-51-36.283517.parquet'
- config_name: results
data_files:
- split: 2023_07_19T15_39_54.753616
path:
- results_2023-07-19T15:39:54.753616.parquet
- split: 2023_10_21T15_51_36.283517
path:
- results_2023-10-21T15-51-36.283517.parquet
- split: latest
path:
- results_2023-10-21T15-51-36.283517.parquet
---
# Dataset Card for Evaluation run of KoboldAI/GPT-J-6B-Janeway
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KoboldAI/GPT-J-6B-Janeway
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KoboldAI/GPT-J-6B-Janeway](https://huggingface.co/KoboldAI/GPT-J-6B-Janeway) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KoboldAI__GPT-J-6B-Janeway",
"harness_winogrande_5",
split="latest")
```
## Latest results
These are the [latest results from run 2023-10-21T15:51:36.283517](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__GPT-J-6B-Janeway/blob/main/results_2023-10-21T15-51-36.283517.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857095,
"f1": 0.04762374161073833,
"f1_stderr": 0.001208940406482686,
"acc": 0.33042240390432354,
"acc_stderr": 0.008312737588634883
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857095,
"f1": 0.04762374161073833,
"f1_stderr": 0.001208940406482686
},
"harness|gsm8k|5": {
"acc": 0.013646702047005308,
"acc_stderr": 0.0031957470754808088
},
"harness|winogrande|5": {
"acc": 0.6471981057616417,
"acc_stderr": 0.013429728101788958
}
}
```
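As a quick local sanity check (no download needed), the aggregated `acc` in the `all` block above appears to be the unweighted mean of the per-task `acc` values; the sketch below copies the two accuracy-bearing entries from the JSON above to verify this:

```python
# Local sanity check: the "all" accuracy reported above is the plain mean of
# the per-task accuracies (values copied from the results JSON above).
results = {
    "harness|gsm8k|5": {"acc": 0.013646702047005308},
    "harness|winogrande|5": {"acc": 0.6471981057616417},
}
accs = [v["acc"] for v in results.values()]
mean_acc = sum(accs) / len(accs)
print(mean_acc)  # ~0.3304224039, matching "all" -> "acc" above
```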
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
fuyulinh04/dataset_glstxt | ---
dataset_info:
features:
- name: gloss
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 11227076.8
num_examples: 73696
- name: test
num_bytes: 2806769.2
num_examples: 18424
download_size: 8513566
dataset_size: 14033846.0
---
# Dataset Card for "dataset_glstxt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_195 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1018868464
num_examples: 200092
download_size: 1032574004
dataset_size: 1018868464
---
# Dataset Card for "chunk_195"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mozilla-foundation/common_voice_2_0 | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
license:
- cc0-1.0
multilinguality:
- multilingual
size_categories:
br:
- 10K<n<100K
ca:
- 10K<n<100K
cnh:
- 1K<n<10K
cv:
- 1K<n<10K
cy:
- 10K<n<100K
de:
- 100K<n<1M
dv:
- 1K<n<10K
en:
- 100K<n<1M
eo:
- 10K<n<100K
es:
- 10K<n<100K
et:
- 1K<n<10K
eu:
- 10K<n<100K
fr:
- 100K<n<1M
ga-IE:
- 1K<n<10K
it:
- 10K<n<100K
kab:
- 100K<n<1M
ky:
- 10K<n<100K
mn:
- 1K<n<10K
nl:
- 10K<n<100K
ru:
- 10K<n<100K
rw:
- 1K<n<10K
sah:
- 1K<n<10K
sl:
- 1K<n<10K
sv-SE:
- 1K<n<10K
tr:
- 1K<n<10K
tt:
- 10K<n<100K
zh-CN:
- 1K<n<10K
zh-TW:
- 10K<n<100K
source_datasets:
- extended|common_voice
paperswithcode_id: common-voice
pretty_name: Common Voice Corpus 2
language_bcp47:
- br
- ca
- cnh
- cv
- cy
- de
- dv
- en
- eo
- es
- et
- eu
- fr
- ga-IE
- it
- kab
- ky
- mn
- nl
- ru
- rw
- sah
- sl
- sv-SE
- tr
- tt
- zh-CN
- zh-TW
extra_gated_prompt: By clicking on “Access repository” below, you also agree to not
attempt to determine the identity of speakers in the Common Voice dataset.
task_categories:
- automatic-speech-recognition
---
# Dataset Card for Common Voice Corpus 2
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://commonvoice.mozilla.org/en/datasets
- **Repository:** https://github.com/common-voice/common-voice
- **Paper:** https://arxiv.org/abs/1912.06670
- **Leaderboard:** https://paperswithcode.com/dataset/common-voice
- **Point of Contact:** [Anton Lozhkov](mailto:anton@huggingface.co)
### Dataset Summary
The Common Voice dataset consists of a unique MP3 and corresponding text file.
Many of the 2366 recorded hours in the dataset also include demographic metadata like age, sex, and accent
that can help improve the accuracy of speech recognition engines.
The dataset currently consists of 1872 validated hours in 28 languages, but more voices and languages are always added.
Take a look at the [Languages](https://commonvoice.mozilla.org/en/languages) page to request a language or start contributing.
### Supported Tasks and Leaderboards
The results for models trained on the Common Voice datasets are available via the
[🤗 Speech Bench](https://huggingface.co/spaces/huggingface/hf-speech-bench)
### Languages
```
Basque, Breton, Catalan, Chinese (China), Chinese (Taiwan), Chuvash, Dhivehi, Dutch, English, Esperanto, Estonian, French, German, Hakha Chin, Irish, Italian, Kabyle, Kinyarwanda, Kyrgyz, Mongolian, Russian, Sakha, Slovenian, Spanish, Swedish, Tatar, Turkish, Welsh
```
## Dataset Structure
### Data Instances
A typical data point comprises the `path` to the audio file and its `sentence`.
Additional fields include `accent`, `age`, `client_id`, `up_votes`, `down_votes`, `gender`, `locale` and `segment`.
```python
{
'client_id': 'd59478fbc1ee646a28a3c652a119379939123784d99131b865a89f8b21c81f69276c48bd574b81267d9d1a77b83b43e6d475a6cfc79c232ddbca946ae9c7afc5',
'path': 'et/clips/common_voice_et_18318995.mp3',
'audio': {
'path': 'et/clips/common_voice_et_18318995.mp3',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 48000
},
'sentence': 'Tasub kokku saada inimestega, keda tunned juba ammust ajast saati.',
'up_votes': 2,
'down_votes': 0,
'age': 'twenties',
'gender': 'male',
'accent': '',
'locale': 'et',
'segment': ''
}
```
### Data Fields
`client_id` (`string`): An id for which client (voice) made the recording
`path` (`string`): The path to the audio file
`audio` (`dict`): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
`sentence` (`string`): The sentence the user was prompted to speak
`up_votes` (`int64`): How many upvotes the audio file has received from reviewers
`down_votes` (`int64`): How many downvotes the audio file has received from reviewers
`age` (`string`): The age of the speaker (e.g. `teens`, `twenties`, `fifties`)
`gender` (`string`): The gender of the speaker
`accent` (`string`): Accent of the speaker
`locale` (`string`): The locale of the speaker
`segment` (`string`): Usually an empty field
### Data Splits
The speech material has been subdivided into portions for dev, train, test, validated, invalidated, reported and other.
The validated data has been reviewed and received enough upvotes to be considered of high quality.
The invalidated data has been reviewed and received downvotes indicating that it is of low quality.
The reported data has been reported, for various reasons.
The other data has not yet been reviewed.
The dev, test and train portions contain data that has been reviewed and deemed of high quality, split into dev, test and train sets.
## Data Preprocessing Recommended by Hugging Face
The following are data preprocessing steps advised by the Hugging Face team. They are accompanied by an example code snippet that shows how to put them into practice.
Many examples in this dataset have trailing quotation marks, e.g. _“the cat sat on the mat.”_. These trailing quotation marks do not change the actual meaning of the sentence, and it is near impossible to infer whether a sentence is a quotation or not from audio data alone. In these cases, it is advised to strip the quotation marks, leaving: _the cat sat on the mat_.
In addition, the majority of training sentences end in punctuation ( . or ? or ! ), whereas just a small proportion do not. In the dev set, **almost all** sentences end in punctuation. Thus, it is recommended to append a full stop ( . ) to the end of the small number of training examples that do not end in punctuation.
```python
from datasets import load_dataset
ds = load_dataset("mozilla-foundation/common_voice_2_0", "en", use_auth_token=True)
def prepare_dataset(batch):
"""Function to preprocess the dataset with the .map method"""
transcription = batch["sentence"]
if transcription.startswith('"') and transcription.endswith('"'):
# we can remove trailing quotation marks as they do not affect the transcription
transcription = transcription[1:-1]
if transcription[-1] not in [".", "?", "!"]:
# append a full-stop to sentences that do not end in punctuation
transcription = transcription + "."
batch["sentence"] = transcription
return batch
ds = ds.map(prepare_dataset, desc="preprocess dataset")
```
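The preprocessing rules above can be checked locally on an in-memory example, without downloading the dataset; this sketch re-defines `prepare_dataset` from the snippet above so it is self-contained:

```python
# Self-contained check of the preprocessing rules described above.
def prepare_dataset(batch):
    """Strip surrounding quotation marks and ensure final punctuation."""
    transcription = batch["sentence"]
    if transcription.startswith('"') and transcription.endswith('"'):
        # remove surrounding quotation marks: they do not affect the audio
        transcription = transcription[1:-1]
    if transcription[-1] not in [".", "?", "!"]:
        # append a full stop to sentences that do not end in punctuation
        transcription = transcription + "."
    batch["sentence"] = transcription
    return batch

print(prepare_dataset({"sentence": '"the cat sat on the mat"'})["sentence"])
# the cat sat on the mat.
```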
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Public Domain, [CC-0](https://creativecommons.org/share-your-work/public-domain/cc0/)
### Citation Information
```
@inproceedings{commonvoice:2020,
author = {Ardila, R. and Branson, M. and Davis, K. and Henretty, M. and Kohler, M. and Meyer, J. and Morais, R. and Saunders, L. and Tyers, F. M. and Weber, G.},
title = {Common Voice: A Massively-Multilingual Speech Corpus},
booktitle = {Proceedings of the 12th Conference on Language Resources and Evaluation (LREC 2020)},
pages = {4211--4215},
year = 2020
}
```
|
scene-genie/labeled_instagram-1774 | ---
dataset_info:
features:
- name: user
dtype: string
- name: image_id
dtype: int64
- name: original_image_path
dtype: string
- name: original_image
dtype: image
- name: langsam_res
dtype: string
- name: caption
dtype: string
- name: brand
dtype: string
- name: quality
dtype: string
- name: lifestyle
dtype: bool
- name: product
dtype: bool
- name: text
dtype: bool
- name: frr_image
dtype: image
splits:
- name: train
num_bytes: 464917725.748
num_examples: 1774
download_size: 463202785
dataset_size: 464917725.748
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sangmiw/hs | ---
task_categories:
- question-answering
size_categories:
- n<1K
language:
- en
--- |
Emanuel/UD_Portuguese-Bosque | ---
language:
- pt
---
# AutoNLP Dataset for project: pos-tag-bosque
## Table of Contents
- [Dataset Description](#dataset-description)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
## Dataset Description
This dataset has been automatically processed by AutoNLP for project pos-tag-bosque.
### Languages
The BCP-47 code for the dataset's language is pt.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"tags": [
5,
7,
0
],
"tokens": [
"Um",
"revivalismo",
"refrescante"
]
},
{
"tags": [
5,
11,
11,
11,
3,
5,
7,
1,
5,
7,
0,
12
],
"tokens": [
"O",
"7",
"e",
"Meio",
"\u00e9",
"um",
"ex-libris",
"de",
"a",
"noite",
"algarvia",
"."
]
}
]
```
### Data Fields
The dataset has the following fields (also called "features"):
```json
{
"tags": "Sequence(feature=ClassLabel(num_classes=17, names=['ADJ', 'ADP', 'ADV', 'AUX', 'CCONJ', 'DET', 'INTJ', 'NOUN', 'NUM', 'PART', 'PRON', 'PROPN', 'PUNCT', 'SCONJ', 'SYM', 'VERB', 'X'], names_file=None, id=None), length=-1, id=None)",
"tokens": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)"
}
```
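The integer `tags` in the first sample above can be decoded back to their part-of-speech label names using the `ClassLabel` names listed in the field description; a minimal local sketch:

```python
# Decode the integer POS tags from the first sample above using the
# ClassLabel names from the "Data Fields" section.
names = ['ADJ', 'ADP', 'ADV', 'AUX', 'CCONJ', 'DET', 'INTJ', 'NOUN', 'NUM',
         'PART', 'PRON', 'PROPN', 'PUNCT', 'SCONJ', 'SYM', 'VERB', 'X']
sample = {"tags": [5, 7, 0], "tokens": ["Um", "revivalismo", "refrescante"]}
decoded = [names[t] for t in sample["tags"]]
print(list(zip(sample["tokens"], decoded)))
# [('Um', 'DET'), ('revivalismo', 'NOUN'), ('refrescante', 'ADJ')]
```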
### Data Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 8328 |
| valid | 476 |
|
heliosprime/twitter_dataset_1713036848 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 17223
num_examples: 37
download_size: 11801
dataset_size: 17223
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713036848"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kejian/codeparrot-valid-more-filtering-debug | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: path
dtype: string
- name: copies
dtype: string
- name: size
dtype: string
- name: content
dtype: string
- name: license
dtype: string
- name: hash
dtype: int64
- name: line_mean
dtype: float64
- name: line_max
dtype: int64
- name: alpha_frac
dtype: float64
- name: autogenerated
dtype: bool
- name: ratio
dtype: float64
- name: config_test
dtype: bool
- name: has_no_keywords
dtype: bool
- name: few_assignments
dtype: bool
splits:
- name: train
num_bytes: 957026
num_examples: 100
download_size: 357047
dataset_size: 957026
---
# Dataset Card for "codeparrot-valid-more-filtering-debug"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |